Self-Train: Self-Supervised On-Device Training for Post-Deployment Adaptation
Jinhao Liu, Xiaofan Yu, T. Simunic
2022 IEEE International Conference on Smart Internet of Things (SmartIoT), August 2022
DOI: 10.1109/SmartIoT55134.2022.00034
Citations: 0
Abstract
Recent years have witnessed a significant increase in the deployment of lightweight machine learning (ML) models on embedded systems, with applications ranging from self-driving vehicles to smart environmental monitoring. However, the performance of an ML model degrades after deployment because the device or its environment may drift. In this paper, we propose Self-Train, a self-supervised on-device training method that lets ML models adapt to post-deployment drift without labels. Self-Train combines offline contrastive feature learning with online drift detection and self-supervised adaptation. Experiments on image and real-world sensor datasets demonstrate consistent accuracy improvements over state-of-the-art online unsupervised methods, by up to 2.45×, while also reducing execution time, with speedups of up to 32.7×.
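The abstract names two ingredients without giving details: offline contrastive feature learning and an online drift detector. As a rough illustration only (the paper does not specify its loss or detector; the NT-Xent-style loss and the mean-feature drift statistic below are generic stand-ins, not Self-Train's actual method), the two pieces might be sketched as:

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Generic NT-Xent-style contrastive loss over two augmented views.

    z1, z2: (N, D) embeddings of the same N samples under two augmentations.
    Each row in z1 is pulled toward its counterpart in z2 and pushed away
    from all other rows. This is a standard contrastive objective, used
    here only to illustrate "contrastive feature learning" in general.
    """
    z = np.concatenate([z1, z2], axis=0)               # (2N, D) joint batch
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize rows
    sim = (z @ z.T) / temperature                      # cosine similarities
    np.fill_diagonal(sim, -np.inf)                     # drop self-similarity
    n = z1.shape[0]
    # The positive partner of row i is row (i + n) mod 2n.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

def drift_score(baseline_feats, window_feats):
    """Toy drift statistic: distance between mean feature vectors of a
    baseline batch and a recent window. A large score would trigger
    self-supervised adaptation; the real detector may differ entirely."""
    return np.linalg.norm(baseline_feats.mean(axis=0)
                          - window_feats.mean(axis=0))
```

In a deployment loop, one would periodically compute `drift_score` on incoming unlabeled data and, when it exceeds a threshold, fine-tune the feature extractor with the contrastive loss — no labels required in either step, which is the point of the self-supervised setup.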