{"title":"基于增强稀疏训练和优化剪枝的网络瘦身","authors":"Ziliang Guo, Xueming Li","doi":"10.1145/3446132.3446159","DOIUrl":null,"url":null,"abstract":"Previous works use a similar process to prune channels: train, prune, fine-tune. In this paper, we treat channel pruning as a method of network architecture search. Specifically, we limit the search space by adding some conditions on it, and after searching, we only reserve the architecture of the network and train it from scratch. We train the model with augmented sparsity to get a higher ratio of pruning. During pruning, we add a protect threshold to prevent the pruned model from being disconnection. Our process of channel pruning is as follows: train with sparsity, prune, train from scratch. we verified the effectiveness of our method on several models, including VGGNet, ResNet and DenseNet on various datasets. Otherwise, we test our method on different architectures of ResNet and analyze the results on both models.","PeriodicalId":125388,"journal":{"name":"Proceedings of the 2020 3rd International Conference on Algorithms, Computing and Artificial Intelligence","volume":"39 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Network Slimming with Augmented Sparse Training and Optimized Pruning\",\"authors\":\"Ziliang Guo, Xueming Li\",\"doi\":\"10.1145/3446132.3446159\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Previous works use a similar process to prune channels: train, prune, fine-tune. In this paper, we treat channel pruning as a method of network architecture search. Specifically, we limit the search space by adding some conditions on it, and after searching, we only reserve the architecture of the network and train it from scratch. We train the model with augmented sparsity to get a higher ratio of pruning. During pruning, we add a protect threshold to prevent the pruned model from being disconnection. Our process of channel pruning is as follows: train with sparsity, prune, train from scratch. we verified the effectiveness of our method on several models, including VGGNet, ResNet and DenseNet on various datasets. 
Otherwise, we test our method on different architectures of ResNet and analyze the results on both models.\",\"PeriodicalId\":125388,\"journal\":{\"name\":\"Proceedings of the 2020 3rd International Conference on Algorithms, Computing and Artificial Intelligence\",\"volume\":\"39 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-12-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2020 3rd International Conference on Algorithms, Computing and Artificial Intelligence\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3446132.3446159\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2020 3rd International Conference on Algorithms, Computing and Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3446132.3446159","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Network Slimming with Augmented Sparse Training and Optimized Pruning
Most previous works follow a similar process to prune channels: train, prune, fine-tune. In this paper, we treat channel pruning as a form of network architecture search. Specifically, we limit the search space by imposing conditions on it, and after searching, we retain only the architecture of the pruned network and train it from scratch. We train the model with augmented sparsity to achieve a higher pruning ratio. During pruning, we add a protection threshold to prevent the pruned network from becoming disconnected. Our channel-pruning pipeline is therefore: train with sparsity, prune, train from scratch. We verify the effectiveness of our method on several models, including VGGNet, ResNet, and DenseNet, across various datasets. In addition, we test our method on different ResNet architectures and analyze the results for both.
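The abstract gives no implementation details, so the following is only a minimal PyTorch sketch of the pipeline it describes, assuming (as in the original Network Slimming of Liu et al., 2017, which this paper builds on) that channel saliency is measured by L1-regularized batch-normalization scaling factors. The names `SPARSITY_LAMBDA`, `PRUNE_RATIO`, `PROTECT_MIN_CHANNELS`, and the helper functions are illustrative placeholders, not taken from the paper.

```python
import torch
import torch.nn as nn

SPARSITY_LAMBDA = 1e-4       # weight of the L1 sparsity penalty (illustrative value)
PRUNE_RATIO = 0.7            # fraction of channels pruned globally (illustrative value)
PROTECT_MIN_CHANNELS = 2     # "protection threshold": keep at least this many channels
                             # per layer so the pruned network stays connected

def sparsity_penalty(model: nn.Module) -> torch.Tensor:
    """L1 penalty on BN scaling factors, pushing unimportant channels toward zero."""
    return sum(m.weight.abs().sum()
               for m in model.modules()
               if isinstance(m, nn.BatchNorm2d))

def train_step(model, x, y, criterion, optimizer):
    """One step of 'train with sparsity': task loss plus the sparsity penalty."""
    optimizer.zero_grad()
    loss = criterion(model(x), y) + SPARSITY_LAMBDA * sparsity_penalty(model)
    loss.backward()
    optimizer.step()
    return loss.item()

def channel_masks(model: nn.Module) -> dict:
    """Select surviving channels with a global threshold on |gamma|, while a
    per-layer protection floor prevents any layer from being pruned away."""
    gammas = torch.cat([m.weight.data.abs().flatten()
                        for m in model.modules()
                        if isinstance(m, nn.BatchNorm2d)])
    threshold = torch.sort(gammas).values[int(len(gammas) * PRUNE_RATIO)]
    masks = {}
    for name, m in model.named_modules():
        if isinstance(m, nn.BatchNorm2d):
            g = m.weight.data.abs()
            keep = g > threshold
            if keep.sum() < PROTECT_MIN_CHANNELS:
                # protection threshold: force-keep the strongest channels
                keep[g.topk(PROTECT_MIN_CHANNELS).indices] = True
            masks[name] = keep
    return masks
```

Following the "train with sparsity, prune, train from scratch" pipeline, the masks would then be used to build a smaller network containing only the kept channels, which is trained from random initialization rather than fine-tuned from the inherited weights.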