Filter Pruning by High-Order Spectral Clustering
Hang Lin; Yifan Peng; Yubo Zhang; Lin Bie; Xibin Zhao; Yue Gao
IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 47, no. 4, pp. 2402-2415
Published: 2024-12-31 · DOI: 10.1109/TPAMI.2024.3524381
URL: https://ieeexplore.ieee.org/document/10819307/
Citations: 0
Abstract
A large amount of redundancy is present in convolutional neural networks (CNNs). Identifying and removing redundant filters is an effective way to compress a CNN with minimal loss in performance. However, most existing redundancy-based pruning methods consider only the pairwise distance between two filters, which can model only simple correlations. Moreover, our experimental observations and analysis show that distance-based pruning methods are not well suited to the high-dimensional features in CNN models. To tackle this issue, we propose a new pruning strategy based on high-order spectral clustering. In this approach, we use a hypergraph structure to model complex correlations among filters, and we obtain high-order information among filters through hypergraph structure learning. Finally, based on this high-order information, we cluster the filters more effectively and remove the redundant filters in each cluster. Experiments on various CNN models and datasets demonstrate that our proposed method outperforms recent state-of-the-art methods. For example, with ResNet50 we achieve a 57.1% FLOPs reduction with no accuracy drop on ImageNet, the first result to achieve lossless pruning at such a high compression ratio.
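The pipeline the abstract describes (group correlated filters, then keep one representative per group) can be sketched with a simplified stand-in: the snippet below uses ordinary pairwise spectral clustering on cosine similarity rather than the paper's hypergraph structure learning, and the per-cluster L1-norm selection rule is an assumption for illustration, not the authors' criterion.

```python
import numpy as np

def spectral_cluster_filters(weight, n_clusters, seed=0):
    """Cluster conv filters via spectral clustering on cosine similarity.

    weight: array of shape (out_channels, in_channels, kh, kw).
    NOTE: this builds a plain pairwise similarity graph; the paper instead
    learns a hypergraph to capture higher-order correlations among filters.
    """
    rng = np.random.default_rng(seed)
    flat = weight.reshape(weight.shape[0], -1)
    # Pairwise |cosine| similarity between flattened filters.
    norms = np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12
    unit = flat / norms
    sim = np.abs(unit @ unit.T)
    np.fill_diagonal(sim, 0.0)
    # Symmetric normalized Laplacian L = I - D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(sim.sum(axis=1) + 1e-12)
    lap = np.eye(len(sim)) - d_inv_sqrt[:, None] * sim * d_inv_sqrt[None, :]
    # Embed each filter with the eigenvectors of the smallest eigenvalues.
    _, vecs = np.linalg.eigh(lap)
    emb = vecs[:, :n_clusters]
    # Plain Lloyd k-means on the spectral embedding.
    centers = emb[rng.choice(len(emb), n_clusters, replace=False)]
    for _ in range(50):
        dist = ((emb[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dist.argmin(axis=1)
        for c in range(n_clusters):
            members = emb[labels == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    return labels

def select_filters(weight, labels):
    """Keep, per cluster, the filter with the largest L1 norm (assumed rule)."""
    l1 = np.abs(weight.reshape(weight.shape[0], -1)).sum(axis=1)
    keep = []
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        keep.append(int(idx[np.argmax(l1[idx])]))
    return sorted(keep)

# Toy layer: 16 filters where the last 8 nearly duplicate the first 8.
rng = np.random.default_rng(1)
w = rng.standard_normal((16, 3, 3, 3))
w[8:] = w[:8] + 0.01 * rng.standard_normal((8, 3, 3, 3))
labels = spectral_cluster_filters(w, n_clusters=8)
kept = select_filters(w, labels)
print(len(kept))  # one surviving filter per non-empty cluster
```

In a real pruning pass, the pruned layer would be rebuilt with only the `kept` output channels (and the next layer's input channels sliced to match), followed by fine-tuning to recover accuracy.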