Automated Hardening of Deep Neural Network Architectures
Michael Beyer, Christoph Schorn, T. Fabarisov, A. Morozov, K. Janschek
DOI: 10.1115/imece2021-72891
Published in: Volume 13: Safety Engineering, Risk, and Reliability Analysis; Research Posters (ASME IMECE 2021), November 2021
Abstract
Designing optimal neural network (NN) architectures is a difficult and time-consuming task, especially when error resilience and hardware efficiency must be considered simultaneously. In this paper, we extend neural architecture search (NAS) to optimize an NN’s error resilience and hardware-related metrics in addition to its classification accuracy. To this end, we consider the error sensitivity of an NN at the architecture level during NAS and additionally incorporate checksums into the network as an external error detection mechanism. With an additional computational overhead as low as 17% for the discovered architectures, checksums are an efficient and effective way to enhance the error resilience of NNs. Furthermore, our results show that cell-based NN architectures maintain their error resilience characteristics when transferred to other tasks.
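
To give a sense of the kind of checksum-based error detection the abstract refers to, below is a minimal sketch in the style of algorithm-based fault tolerance (ABFT) for a single fully connected layer: a precomputed column-sum checksum of the weight matrix lets the sum of the layer's output be verified independently. The function name, tolerance, and fault-handling choice are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def checked_linear(W, x, tol=1e-4):
    """Compute y = W @ x with an ABFT-style checksum (illustrative sketch).

    The checksum row c = 1^T W is the vector of column sums of W, so
    mathematically sum(y) = (1^T W) x = c @ x. A mismatch beyond the
    floating-point tolerance signals a fault in the computation.
    """
    c = W.sum(axis=0)      # checksum row: column sums of W (precomputable)
    y = W @ x              # the protected layer computation
    expected = c @ x       # independent checksum of the output
    if abs(y.sum() - expected) > tol:
        raise RuntimeError("Checksum mismatch: possible computation fault")
    return y

# Usage: a random layer; perturbing one entry of y (e.g., a simulated
# bit flip) before the comparison would trip the checksum.
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 128))
x = rng.standard_normal(128)
y = checked_linear(W, x)
```

Note that the checksum adds only one extra dot product (`c @ x`) and a reduction per layer on top of the full matrix-vector product, which is consistent with the low relative overhead (as low as 17%) reported for the discovered architectures.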