{"title":"基于动态稀疏率的SNIP预剪枝算法优化","authors":"Jianjun Wang, Ximeng Pan, Wanqing Li, Min Zhang","doi":"10.1109/ITCA52113.2020.00032","DOIUrl":null,"url":null,"abstract":"As a key parameter of network pruning, the sparsity rate determines the sparsity effect after network pruning, and is closely related to the complexity, accuracy and application of neural network. Therefore, the determination of neural network sparsity rate has become one of the research hotspots. After reading a large number of relevant literatures, it is found that in the process of model training, the value of sparsity rate is often set artificially according to experience. So that the sparsity rate cannot change dynamically with the change of experimental environment and data, and the accuracy of its value is difficult to determine. To solve the above problems, this paper introduces dynamic sparsity rate, optimizes the SNIP pre-pruning algorithm, and proposes the PDSR algorithm. It calculates the sparsity rate dynamically according to the connection sensitivity of weights and realizes the pre-pruning of neural networks. Experimental results on various convolutional neural networks show that compared with SNIP algorithm, the PDSR algorithm has obvious improvement in accuracy rate and operation efficiency.","PeriodicalId":103309,"journal":{"name":"2020 2nd International Conference on Information Technology and Computer Application (ITCA)","volume":"108 47","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"PDSR: Optimization of SNIP Pre-Pruning Algorithm Based On Dynamic Sparsity Rate\",\"authors\":\"Jianjun Wang, Ximeng Pan, Wanqing Li, Min Zhang\",\"doi\":\"10.1109/ITCA52113.2020.00032\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As a key parameter of network pruning, the sparsity rate determines the sparsity effect after network pruning, and is closely related to the complexity, accuracy and application of neural network. Therefore, the determination of neural network sparsity rate has become one of the research hotspots. After reading a large number of relevant literatures, it is found that in the process of model training, the value of sparsity rate is often set artificially according to experience. So that the sparsity rate cannot change dynamically with the change of experimental environment and data, and the accuracy of its value is difficult to determine. To solve the above problems, this paper introduces dynamic sparsity rate, optimizes the SNIP pre-pruning algorithm, and proposes the PDSR algorithm. It calculates the sparsity rate dynamically according to the connection sensitivity of weights and realizes the pre-pruning of neural networks. 
Experimental results on various convolutional neural networks show that compared with SNIP algorithm, the PDSR algorithm has obvious improvement in accuracy rate and operation efficiency.\",\"PeriodicalId\":103309,\"journal\":{\"name\":\"2020 2nd International Conference on Information Technology and Computer Application (ITCA)\",\"volume\":\"108 47\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 2nd International Conference on Information Technology and Computer Application (ITCA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ITCA52113.2020.00032\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 2nd International Conference on Information Technology and Computer Application (ITCA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ITCA52113.2020.00032","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
PDSR: Optimization of SNIP Pre-Pruning Algorithm Based On Dynamic Sparsity Rate
As a key parameter of network pruning, the sparsity rate determines how sparse the network becomes after pruning and is closely tied to a neural network's complexity, accuracy, and applicability, so determining it has become an active research topic. A review of the relevant literature shows that during model training the sparsity rate is usually set by hand, based on experience; it therefore cannot adapt as the experimental environment and data change, and the accuracy of the chosen value is difficult to verify. To address these problems, this paper introduces a dynamic sparsity rate, optimizes the SNIP pre-pruning algorithm, and proposes the PDSR algorithm, which computes the sparsity rate dynamically from the connection sensitivity of the weights and pre-prunes the network accordingly. Experimental results on several convolutional neural networks show that, compared with the SNIP algorithm, PDSR clearly improves both accuracy and computational efficiency.
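To make the idea concrete, below is a minimal PyTorch sketch of SNIP-style pre-pruning with a dynamically derived sparsity rate. The connection-sensitivity computation follows the published SNIP criterion (saliency of each weight from one mini-batch before training), but the abstract does not state PDSR's actual rule for turning sensitivities into a rate, so the mean-sensitivity threshold below is a hypothetical placeholder, not the authors' method; all function names are illustrative.

```python
# Sketch: SNIP connection sensitivity + a dynamically derived sparsity rate.
# The mean-sensitivity threshold is an assumed stand-in for PDSR's rule,
# which is not specified in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


def connection_sensitivity(model, inputs, targets):
    """SNIP connection sensitivity: s_j = |g_j * w_j| / sum_k |g_k * w_k|,
    computed from a single mini-batch before any training."""
    model.zero_grad()
    loss = F.cross_entropy(model(inputs), targets)
    loss.backward()
    per_param = {name: (p.grad * p).abs()
                 for name, p in model.named_parameters() if p.grad is not None}
    total = sum(s.sum() for s in per_param.values())
    return {name: s / total for name, s in per_param.items()}


def dynamic_masks(sensitivity):
    """Derive the sparsity rate from the sensitivity distribution instead of
    a hand-set constant: prune every connection whose sensitivity falls
    below the global mean (hypothetical rule, for illustration only)."""
    all_s = torch.cat([s.flatten() for s in sensitivity.values()])
    threshold = all_s.mean()  # assumed dynamic rule, not the paper's
    sparsity_rate = (all_s < threshold).float().mean().item()
    masks = {name: (s >= threshold).float() for name, s in sensitivity.items()}
    return masks, sparsity_rate


# Usage: compute masks once on one batch, then train the pruned network.
model = nn.Sequential(nn.Flatten(), nn.Linear(784, 300),
                      nn.ReLU(), nn.Linear(300, 10))
x, y = torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,))
masks, rate = dynamic_masks(connection_sensitivity(model, x, y))
print(f"dynamically determined sparsity rate: {rate:.2%}")
```

Unlike plain SNIP, where the number of retained connections is fixed by a hand-chosen rate, the threshold here adapts to the sensitivity distribution of the given model and data batch, which is the behavior the abstract attributes to PDSR.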