Pruning Deep Feature Networks Using Channel Importance Propagation
Honglin Chen, Chunting Li
2021 2nd International Conference on Computer Science and Management Technology (ICCSMT), November 2021
DOI: 10.1109/ICCSMT54525.2021.00080
Citations: 2
Abstract
Deep convolutional neural networks use their powerful feature-representation capability to extract deep information about targets, which improves model accuracy. However, these models are complex, carry a heavy computational burden, and place large demands on compute and memory resources, which hurts their real-time performance and compactness. To address these limitations, we define a new metric that measures the importance of convolutional kernels in conjunction with feature maps, introduce a non-linear mapping function that maps feature maps to important convolutional kernels, and propose a continuous, smooth pruning strategy for deep convolutional neural networks. The resulting model, Pruning Deep Feature Networks Using Channel Importance Propagation, reduces network complexity and computational burden and improves model accuracy and training efficiency, while preserving the representation capability of the feature network with only a small loss in system performance. We tested the proposed model on three datasets, CIFAR-10, CIFAR-100, and SVHN; the results demonstrate its validity.
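The abstract does not specify the exact importance metric or mapping function, so the following is only a minimal sketch of the general idea of channel pruning guided by feature-map statistics. It assumes a stand-in importance score (mean absolute activation per output channel) and a simple top-k selection; the paper's actual metric, non-linear mapping, and smooth pruning schedule would replace these placeholders.

```python
import numpy as np

def channel_importance(feature_maps):
    """Stand-in importance score: mean absolute activation per channel.

    feature_maps: array of shape (batch, channels, H, W) taken from one
    convolutional layer's output. The paper derives importance from feature
    maps in a more elaborate way; this is only an illustrative proxy.
    """
    return np.abs(feature_maps).mean(axis=(0, 2, 3))

def prune_channels(weights, feature_maps, keep_ratio=0.5):
    """Keep the top `keep_ratio` fraction of output channels by importance.

    weights: conv kernel of shape (out_channels, in_channels, kH, kW).
    Returns the pruned kernel and the indices of the channels kept.
    """
    scores = channel_importance(feature_maps)
    k = max(1, int(round(keep_ratio * len(scores))))
    # Indices of the k highest-scoring channels, in ascending order.
    keep = np.sort(np.argsort(scores)[::-1][:k])
    return weights[keep], keep

# Toy usage: prune half of the 8 output channels of a random conv layer.
rng = np.random.default_rng(0)
w = rng.standard_normal((8, 3, 3, 3))        # 8 output channels
acts = rng.standard_normal((4, 8, 16, 16))   # activations for a batch of 4
pruned_w, kept = prune_channels(w, acts, keep_ratio=0.5)
print(pruned_w.shape, kept.shape)  # (4, 3, 3, 3) (4,)
```

In a real pipeline, pruning a layer's output channels also requires removing the corresponding input channels of the next layer, and the abstract's "continuous and smooth" strategy suggests the keep ratio is tightened gradually over training rather than applied once.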