PRF: deep neural network compression by systematic pruning of redundant filters

C. H. Sarvani, Mrinmoy Ghorai, S. H. Shabbeer Basha
Journal: Neural Computing and Applications
DOI: 10.1007/s00521-024-10256-5
Published: 2024-08-14 (Journal Article)
Citations: 0

Abstract




In deep neural networks, the filters of convolutional layers play an important role in extracting features from the input. Redundant filters often extract similar features, leading to increased computational overhead and larger model size. To address this issue, a two-step approach is proposed in this paper. First, clusters of redundant filters are identified based on the cosine distance between them using hierarchical agglomerative clustering (HAC). Next, instead of pruning all the redundant filters from every cluster in a single shot, we propose to prune the filters in a systematic manner. To prune the filters, the cluster importance among all clusters and the filter importance within each cluster are identified using an \(\ell _1\)-norm-based criterion. Then, based on the pruning ratio, filters are pruned systematically from the least important cluster to the most important one. The proposed method showed better results compared with other clustering-based works. The benchmark datasets CIFAR-10 and ImageNet are used in the experiments. After pruning 83.92% of the parameters from the VGG-16 architecture, an improvement over the baseline is observed. After pruning 54.59% and 49.33% of the FLOPs from ResNet-56 and ResNet-110, respectively, both showed an improvement in accuracy. After pruning 52.97% of the FLOPs, the top-5 accuracy of ResNet-50 drops by only 0.56 on ImageNet.
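The two-step procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the distance threshold for cutting the HAC dendrogram, the use of the summed \(\ell _1\)-norm as cluster importance, and the rule of always retaining one representative filter per cluster are assumptions made here for the sketch.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def prune_redundant_filters(filters, prune_ratio, dist_threshold=0.3):
    """Return indices of filters to keep for one conv layer.

    filters: (n_filters, k*k*c_in) array, each row a flattened filter.
    prune_ratio: fraction of filters to remove.
    dist_threshold: cosine-distance cut for the HAC dendrogram (assumed value).
    """
    n = filters.shape[0]

    # Step 1: group redundant filters by cosine distance using HAC.
    Z = linkage(filters, method='average', metric='cosine')
    labels = fcluster(Z, t=dist_threshold, criterion='distance')

    # l1-norm criterion: filter importance within a cluster, and
    # (here) summed l1-norm as the importance of the whole cluster.
    filter_l1 = np.abs(filters).sum(axis=1)
    clusters = {c: np.where(labels == c)[0] for c in np.unique(labels)}
    cluster_imp = {c: filter_l1[idx].sum() for c, idx in clusters.items()}

    # Step 2: prune systematically, least important cluster first,
    # and within each cluster the least important filters first.
    n_to_prune = int(prune_ratio * n)
    pruned = []
    for c in sorted(clusters, key=cluster_imp.get):
        order = clusters[c][np.argsort(filter_l1[clusters[c]])]
        for f in order[:-1]:          # keep one representative per cluster
            if len(pruned) == n_to_prune:
                break
            pruned.append(f)
    return np.setdiff1d(np.arange(n), pruned)
```

For example, a layer whose filters come in scaled pairs (each filter duplicated at twice the magnitude) forms one HAC cluster per pair; with a 50% pruning ratio the sketch drops the lower-\(\ell _1\) member of each pair and keeps the rest.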
