ElasticDNN: On-Device Neural Network Remodeling for Adapting Evolving Vision Domains at Edge
Qinglong Zhang; Rui Han; Chi Harold Liu; Guoren Wang; Lydia Y. Chen
IEEE Transactions on Computers, vol. 73, no. 6, pp. 1616-1630, published 2024-03-14. DOI: 10.1109/TC.2024.3375608
Abstract
Executing deep neural network (DNN)-based vision tasks on edge devices encounters challenging scenarios of significant and continually evolving data domains (e.g., background or subpopulation shift). With limited resources, state-of-the-art domain adaptation (DA) methods either cause high training overheads on large DNN models or incur significant accuracy losses when adapting small/compressed models in an online fashion. Inefficient resource scheduling among multiple applications further degrades overall model accuracy. In this paper, we present ElasticDNN, a framework that enables online DNN remodeling for applications encountering evolving domain drifts at the edge. Its first key component is the master-surrogate DNN model pair, which dynamically generates a small surrogate DNN by retaining and training the regions of the large master DNN most relevant to the new domain. The second novelty of ElasticDNN is filter-grained resource scheduling, which allocates GPU resources based on online accuracy estimation and the DNN remodeling of co-running applications. We fully implement ElasticDNN and demonstrate its effectiveness through extensive experiments. The results show that, compared to existing online DA methods using the same model sizes, ElasticDNN improves accuracy by 23.31% and reduces adaptation time by 35.67x. In the more challenging multi-application scenario, ElasticDNN improves accuracy by an average of 25.91%.
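The first component described above, master-surrogate remodeling, derives a small surrogate DNN from the regions of a large master DNN that matter most for the new domain. A minimal PyTorch sketch of this idea follows; the relevance score (gradient-times-weight magnitude), the keep ratio, and the layer shapes are illustrative assumptions, not ElasticDNN's actual selection criteria.

```python
# Hypothetical sketch: rank a master layer's filters by relevance to a
# new-domain batch, then copy the top-ranked filters into a smaller surrogate.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Master layer with 64 filters; the surrogate keeps only the 16 most relevant.
master_conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
keep = 16

# Pretend this batch comes from the newly encountered domain.
x = torch.randn(8, 3, 32, 32)
loss = master_conv(x).mean()   # dummy objective, only used to obtain gradients
loss.backward()

# Per-filter relevance: |grad * weight| summed over each filter's parameters.
relevance = (master_conv.weight.grad * master_conv.weight).abs().sum(dim=(1, 2, 3))
top_idx = torch.topk(relevance, keep).indices

# Surrogate layer initialized from the selected master filters, then trained
# online on new-domain data (training loop omitted here).
surrogate_conv = nn.Conv2d(3, keep, kernel_size=3, padding=1)
with torch.no_grad():
    surrogate_conv.weight.copy_(master_conv.weight[top_idx])
    surrogate_conv.bias.copy_(master_conv.bias[top_idx])

print("surrogate output shape:", surrogate_conv(x).shape)  # (8, 16, 32, 32)
```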
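The second component, filter-grained GPU scheduling, divides GPU resources among co-running applications according to their estimated accuracy gains. Below is a hypothetical greedy sketch of such a scheduler; the diminishing-returns gain model, the slice size, and the application names are assumptions for illustration and do not reflect ElasticDNN's actual policy.

```python
# Hypothetical sketch: greedily hand out fixed-size GPU slices to whichever
# application currently promises the largest estimated accuracy gain.
from dataclasses import dataclass

@dataclass
class App:
    name: str
    share: float = 0.0  # fraction of the GPU currently assigned

    def estimated_gain(self, slice_size: float) -> float:
        # Assumed diminishing estimated accuracy gain for the next slice.
        return slice_size / (1.0 + 5.0 * self.share)

def schedule(apps, total_gpu=1.0, slice_size=0.05):
    """Greedily allocate GPU slices to maximize total estimated accuracy gain."""
    remaining = total_gpu
    while remaining >= slice_size:
        best = max(apps, key=lambda a: a.estimated_gain(slice_size))
        best.share += slice_size
        remaining -= slice_size
    return {a.name: round(a.share, 2) for a in apps}

apps = [App("detector"), App("classifier"), App("segmenter")]
print(schedule(apps))  # e.g. {'detector': 0.35, 'classifier': 0.35, 'segmenter': 0.3}
```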
About the Journal
The IEEE Transactions on Computers is a monthly publication with a wide distribution to researchers, developers, technical managers, and educators in the computer field. It publishes papers on research in areas of current interest to the readers. These areas include, but are not limited to, the following: a) computer organizations and architectures; b) operating systems, software systems, and communication protocols; c) real-time systems and embedded systems; d) digital devices, computer components, and interconnection networks; e) specification, design, prototyping, and testing methods and tools; f) performance, fault tolerance, reliability, security, and testability; g) case studies and experimental and theoretical evaluations; and h) new and important applications and trends.