{"title":"Adam Optimized Pruning Method for Few-Shot Learning with GAN Data Augmentation","authors":"Shi Qirui, Chen Hongle, Chen Juan, Wen Quan","doi":"10.1109/ICCWAMTIP53232.2021.9674082","DOIUrl":null,"url":null,"abstract":"Weight pruning is widely used for model compression and acceleration. In this work, a novel Adam optimization method for few-shot learning with GAN data augmentation is proposed. The first-order Taylor series is implemented to evaluate parameters' importance toward the loss function. And with the given compression ratio, parameters with importance above the threshold are updated by the Adam optimizer with momentum-accelerated weight decay, while others have negative updates as the penalization. After continuous iterations, the model enables to achieve corresponding sparsity ratio, with the influence of the redundant parameters reducing to a low extent. Experiments demonstrate that this method is effective on ResNet with CUB and ISIC-2018 datasets. Note that CUB and ISIC-2018 are datasets about birds and skin deceases, respectively, which represents the generalization of our method on cross-domain classification issues. As a result, the pruned model is able to retain the accuracy with high model sparse ratios. And in some specific compress ratio, like 10× for CUB dataset and 3 × for ISIC-2018 dataset, the pruned model even outperforms the origin model by 3.15% and 1.16%, respectively.","PeriodicalId":358772,"journal":{"name":"2021 18th International Computer Conference on Wavelet Active Media Technology and Information Processing (ICCWAMTIP)","volume":"60 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 18th International Computer Conference on Wavelet Active Media Technology and Information Processing (ICCWAMTIP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCWAMTIP53232.2021.9674082","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Weight pruning is widely used for model compression and acceleration. In this work, a novel Adam-optimized pruning method for few-shot learning with GAN data augmentation is proposed. A first-order Taylor series is used to evaluate each parameter's importance to the loss function. Given a target compression ratio, parameters whose importance exceeds the corresponding threshold are updated by the Adam optimizer with momentum-accelerated weight decay, while the remaining parameters receive negative updates as a penalization. After successive iterations, the model reaches the target sparsity ratio, with the influence of the redundant parameters reduced to a low level. Experiments demonstrate that this method is effective on ResNet with the CUB and ISIC-2018 datasets. Note that CUB and ISIC-2018 contain images of birds and skin diseases, respectively, which demonstrates the generalization of our method to cross-domain classification problems. As a result, the pruned model retains accuracy at high sparsity ratios, and at certain compression ratios, 10× on CUB and 3× on ISIC-2018, the pruned model even outperforms the original model by 3.15% and 1.16%, respectively.
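To make the update rule concrete, below is a minimal PyTorch sketch of this style of Taylor-importance pruning. It is an illustration under assumptions, not the authors' implementation: the function names (`taylor_importance`, `prune_step`), the global top-k thresholding, and the `penalty` coefficient used for the negative update are all hypothetical choices.

```python
import torch
import torch.nn as nn

def taylor_importance(model: nn.Module) -> dict:
    """Score each weight by |w * dL/dw|: the first-order Taylor estimate
    of how much the loss would change if that weight were zeroed."""
    scores = {}
    for name, p in model.named_parameters():
        if p.grad is not None:
            scores[name] = (p.detach() * p.grad.detach()).abs()
    return scores

def prune_step(model: nn.Module, optimizer: torch.optim.Optimizer,
               compression_ratio: float, penalty: float = 1e-3) -> None:
    """One iteration: Adam updates every parameter as usual, then weights
    whose importance falls below the global threshold implied by the
    compression ratio are shrunk toward zero (the 'negative update')."""
    scores = taylor_importance(model)
    flat = torch.cat([s.flatten() for s in scores.values()])
    # Keep roughly 1/compression_ratio of the weights above the threshold.
    k = max(1, int(flat.numel() / compression_ratio))
    threshold = torch.topk(flat, k).values.min()

    optimizer.step()  # momentum-accelerated Adam update for all parameters

    with torch.no_grad():
        for name, p in model.named_parameters():
            if name in scores:
                low = scores[name] < threshold
                p[low] *= 1.0 - penalty  # penalize low-importance weights

# Hypothetical usage inside a training loop:
#   optimizer.zero_grad(); loss.backward()
#   prune_step(model, optimizer, compression_ratio=10)
```

In this sketch the shrink factor plays the role of the penalization described in the abstract: repeated over many iterations, it drives low-importance weights toward zero so that the model can be pruned to the target sparsity with little residual influence from the removed parameters.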