{"title":"Test-cost-sensitive Convolutional Neural Networks with Expert Branches","authors":"Mahdi Naghibi, R. Anvari, A. Forghani, B. Minaei","doi":"10.5121/sipij.2019.10502","DOIUrl":null,"url":null,"abstract":"It has been proven that deeper convolutional neural networks (CNN) can result in better accuracy in many problems, but this accuracy comes with a high computational cost. Also, input instances have not the same difficulty. As a solution for accuracy vs. computational cost dilemma, we introduce a new test-cost-sensitive method for convolutional neural networks. This method trains a CNN with a set of auxiliary outputs and expert branches in some middle layers of the network. The expert branches decide to use a shallower part of the network or going deeper to the end, based on the difficulty of input instance. The expert branches learn to determine: is the current network prediction is wrong and if the given instance passed to deeper layers of the network it will generate right output; If not, then the expert branches stop the computation process. The experimental results on standard dataset CIFAR-10 show that the proposed method can train models with lower test-cost and competitive accuracy in comparison with basic models.","PeriodicalId":90726,"journal":{"name":"Signal and image processing : an international journal","volume":"72 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2019-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Signal and image processing : an international journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5121/sipij.2019.10502","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
It has been shown that deeper convolutional neural networks (CNNs) can achieve better accuracy on many problems, but this accuracy comes at a high computational cost. Moreover, input instances do not all have the same difficulty. As a solution to the accuracy versus computational cost dilemma, we introduce a new test-cost-sensitive method for convolutional neural networks. The method trains a CNN with a set of auxiliary outputs and expert branches attached to some of the network's intermediate layers. Based on the difficulty of the input instance, each expert branch decides whether to use the shallower part of the network or to continue to deeper layers. The expert branches learn to determine whether the current network prediction is wrong and whether passing the given instance to deeper layers would produce the correct output; if not, the expert branch stops the computation at that point. Experimental results on the standard CIFAR-10 dataset show that the proposed method trains models with lower test cost and competitive accuracy compared with the baseline models.
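The abstract describes an early-exit style architecture: an auxiliary classifier plus an "expert" head at an intermediate layer decides, per input, whether the shallow prediction suffices or the instance should be passed to the deeper, costlier layers. The following is a minimal sketch of that idea in PyTorch; the layer sizes, the use of a single expert branch, the structure of the expert head, and the exit threshold are illustrative assumptions and not the authors' exact configuration.

```python
# Minimal sketch of a CNN with one auxiliary output and an expert branch
# that can stop computation early at test time. All hyperparameters here
# are assumptions for illustration only.
import torch
import torch.nn as nn


class ExpertBranch(nn.Module):
    """Auxiliary classifier plus a small expert head that scores whether
    the instance should be sent to the deeper part of the network."""

    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(in_channels, num_classes)  # auxiliary output
        self.expert = nn.Linear(in_channels, 1)                 # "go deeper" score

    def forward(self, x: torch.Tensor):
        feats = self.pool(x).flatten(1)
        logits = self.classifier(feats)                 # early (shallow) prediction
        go_deeper = torch.sigmoid(self.expert(feats))   # estimated need for deeper layers
        return logits, go_deeper


class EarlyExitCNN(nn.Module):
    def __init__(self, num_classes: int = 10, threshold: float = 0.5):
        super().__init__()
        self.threshold = threshold
        self.block1 = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.branch1 = ExpertBranch(32, num_classes)
        self.block2 = nn.Sequential(
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_classes))

    def forward(self, x: torch.Tensor):
        x = self.block1(x)
        early_logits, go_deeper = self.branch1(x)
        # During training, the auxiliary output would also receive a loss;
        # here only the final output is returned for brevity.
        if not self.training and go_deeper.max() < self.threshold:
            # Test time: the expert branch judges the shallow prediction
            # sufficient for this batch, so the deeper layers are skipped.
            return early_logits
        return self.head(self.block2(x))


# Usage on a CIFAR-10-sized batch.
model = EarlyExitCNN().eval()
with torch.no_grad():
    out = model(torch.randn(4, 3, 32, 32))
print(out.shape)  # torch.Size([4, 10])
```

Note that this sketch makes the exit decision per batch for simplicity; a per-instance implementation would route each example through the deeper layers individually according to its own expert score.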