Title: UCBEE: A Multi Armed Bandit Approach for Early-Exit in Neural Networks
Authors: Roberto G. Pacheco; Divya J. Bajpai; Mark Shifrin; Rodrigo S. Couto; Daniel Sadoc Menasché; Manjesh K. Hanawal; Miguel Elias M. Campista
Journal: IEEE Transactions on Network and Service Management, vol. 22, no. 1, pp. 107-120
DOI: 10.1109/TNSM.2024.3479076 (https://ieeexplore.ieee.org/document/10714362/)
Published: 2024-10-11
Citations: 0
Abstract
Deep Neural Networks (DNNs) have demonstrated exceptional performance in diverse tasks. However, deploying DNNs on resource-constrained devices presents challenges due to energy consumption and delay overheads. To mitigate these issues, early-exit DNNs (EE-DNNs) incorporate exit branches within intermediate layers to enable early inferences. These branches estimate prediction confidence and employ a fixed threshold to determine early termination. Nonetheless, fixed thresholds yield suboptimal performance in dynamic contexts, where context refers to distortions caused by environmental conditions in image classification, or to variations in input distribution due to concept drift in NLP. In this article, we introduce Upper Confidence Bound in EE-DNNs (UCBEE), an online algorithm that dynamically adjusts early-exit thresholds based on context. UCBEE leverages confidence levels at intermediate layers and learns without the need for true labels. Through extensive experiments in image classification and NLP, we demonstrate that UCBEE achieves logarithmic regret, converging after just a few thousand observations across multiple contexts. We evaluate UCBEE for image classification and text mining. In the latter, we show that UCBEE can reduce cumulative regret and lower latency by approximately 10%–20% without compromising accuracy when compared to fixed-threshold alternatives. Our findings highlight UCBEE as an effective method for enhancing EE-DNN efficiency.
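To make the bandit view of threshold selection concrete, the following is a minimal sketch of UCB-based early-exit control, not the authors' implementation: each candidate confidence threshold is treated as an arm, and a label-free reward (branch confidence minus a latency penalty) drives the standard UCB index. The class name `UCBThresholdSelector`, the threshold grid, the simulated confidences, and the reward definition are all illustrative assumptions; the paper's exact formulation may differ.

```python
# A minimal sketch (not the authors' implementation) of UCB-based early-exit
# threshold selection. Threshold grid, reward, and latency values are assumed.
import math
import random


class UCBThresholdSelector:
    """Treats each candidate exit threshold as a bandit arm (hypothetical helper)."""

    def __init__(self, thresholds, exploration=2.0):
        self.thresholds = thresholds            # candidate exit thresholds (arms)
        self.exploration = exploration          # UCB exploration coefficient
        self.counts = [0] * len(thresholds)     # pulls per arm
        self.values = [0.0] * len(thresholds)   # empirical mean reward per arm
        self.t = 0                              # total rounds played

    def select(self):
        """Pick the arm with the largest UCB index; play each arm once first."""
        self.t += 1
        for i, n in enumerate(self.counts):
            if n == 0:
                return i
        ucb = [
            self.values[i]
            + math.sqrt(self.exploration * math.log(self.t) / self.counts[i])
            for i in range(len(self.thresholds))
        ]
        return max(range(len(self.thresholds)), key=lambda i: ucb[i])

    def update(self, arm, reward):
        """Incrementally update the empirical mean reward of the chosen arm."""
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]


def early_exit_inference(confidence_early, threshold):
    """Exit at the intermediate branch if its confidence clears the threshold;
    otherwise fall through to the final layer (higher latency, assumed values)."""
    if confidence_early >= threshold:
        return "early", 1.0   # relative latency of the early branch (assumed)
    return "final", 2.0       # relative latency of the full network (assumed)


if __name__ == "__main__":
    random.seed(0)
    arms = [0.5, 0.6, 0.7, 0.8, 0.9]       # assumed threshold grid
    bandit = UCBThresholdSelector(arms)
    latency_weight = 0.3                   # assumed accuracy/latency trade-off

    for _ in range(5000):
        arm = bandit.select()
        conf = random.betavariate(5, 2)    # simulated branch confidence for one input
        branch, latency = early_exit_inference(conf, arms[arm])
        # Label-free reward: confidence of the branch actually used, minus a
        # latency penalty (an illustrative proxy, not the paper's exact reward).
        used_conf = conf if branch == "early" else 0.95  # assumed final-layer confidence
        bandit.update(arm, used_conf - latency_weight * latency)

    best = max(range(len(arms)), key=lambda i: bandit.values[i])
    print(f"threshold with highest empirical reward: {arms[best]:.2f}")
```

Under these assumptions, the selector converges to the threshold whose empirical reward best balances confidence against latency; in a real deployment the simulated confidences would be replaced by the EE-DNN's intermediate-branch outputs, and a separate selector could be kept per context.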
Journal description:
IEEE Transactions on Network and Service Management will publish (online only) peer-reviewed, archival-quality papers that advance the state-of-the-art and practical applications of network and service management. Theoretical research contributions (presenting new concepts and techniques) and applied contributions (reporting on experiences and experiments with actual systems) will be encouraged. These transactions will focus on the key technical issues related to: Management Models, Architectures and Frameworks; Service Provisioning, Reliability and Quality Assurance; Management Functions; Enabling Technologies; Information and Communication Models; Policies; Applications and Case Studies; Emerging Technologies and Standards.