Filter Pruning with Convolutional Approximation Small Model Framework

Computation · IF 1.9 · Q2 (Mathematics, Interdisciplinary Applications) · Published: 2023-09-05 · DOI: 10.3390/computation11090176
Monthon Intraraprasit, O. Chitsobhuk
{"title":"Filter Pruning with Convolutional Approximation Small Model Framework","authors":"Monthon Intraraprasit, O. Chitsobhuk","doi":"10.3390/computation11090176","DOIUrl":null,"url":null,"abstract":"Convolutional neural networks (CNNs) are extensively utilized in computer vision; however, they pose challenges in terms of computational time and storage requirements. To address this issue, one well-known approach is filter pruning. However, fine-tuning pruned models necessitates substantial computing power and a large retraining dataset. To restore model performance after pruning each layer, we propose the Convolutional Approximation Small Model (CASM) framework. CASM involves training a compact model with the remaining kernels and optimizing their weights to restore feature maps that resemble the original kernels. This method requires less complexity and fewer training samples compared to basic fine-tuning. We evaluate the performance of CASM on the CIFAR-10 and ImageNet datasets using VGG-16 and ResNet-50 models. The experimental results demonstrate that CASM surpasses the basic fine-tuning framework in terms of time acceleration (3.3× faster), requiring a smaller dataset for performance recovery after pruning, and achieving enhanced accuracy.","PeriodicalId":52148,"journal":{"name":"Computation","volume":" ","pages":""},"PeriodicalIF":1.9000,"publicationDate":"2023-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/computation11090176","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

Convolutional neural networks (CNNs) are widely used in computer vision; however, they pose challenges in terms of computation time and storage requirements. Filter pruning is a well-known approach to this problem, but fine-tuning a pruned model demands substantial computing power and a large retraining dataset. To restore model performance after pruning each layer, we propose the Convolutional Approximation Small Model (CASM) framework. CASM trains a compact model built from the remaining kernels and optimizes their weights so that the resulting feature maps resemble those produced by the original kernels. This method has lower computational complexity and needs fewer training samples than basic fine-tuning. We evaluate CASM on the CIFAR-10 and ImageNet datasets using VGG-16 and ResNet-50 models. The experimental results show that CASM outperforms the basic fine-tuning framework: it is 3.3× faster, requires a smaller dataset to recover performance after pruning, and achieves higher accuracy.
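Based on this description, the core of CASM is a layer-wise recovery step: after pruning, the remaining kernels are re-optimized so that their feature maps approximate those of the unpruned layers, using far less data than full fine-tuning. The sketch below illustrates that general idea in PyTorch under stated assumptions; the L1-norm pruning criterion, the MSE reconstruction loss, the two stand-in convolutional layers, and the random calibration batch are illustrative choices, not the authors' implementation.

```python
# A minimal, hypothetical sketch of layer-wise feature-map reconstruction after
# filter pruning (an assumed reading of CASM, not the authors' released code).
# The L1-norm criterion, MSE loss, layer sizes, and calibration data are
# illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-ins for two consecutive convolutional layers of a network such as VGG-16.
conv1 = nn.Conv2d(64, 128, kernel_size=3, padding=1)
conv2 = nn.Conv2d(128, 128, kernel_size=3, padding=1)

# 1) Choose which filters of conv1 to keep, here by L1 norm (assumed criterion).
keep_ratio = 0.5
scores = conv1.weight.detach().abs().sum(dim=(1, 2, 3))   # one score per output filter
n_keep = int(keep_ratio * conv1.out_channels)
keep_idx = torch.argsort(scores, descending=True)[:n_keep]

# 2) Build the compact ("small") model: conv1 keeps only the selected output
#    channels, and conv2 drops the matching input channels.
pruned1 = nn.Conv2d(64, n_keep, kernel_size=3, padding=1)
pruned1.weight.data = conv1.weight.data[keep_idx].clone()
pruned1.bias.data = conv1.bias.data[keep_idx].clone()

pruned2 = nn.Conv2d(n_keep, 128, kernel_size=3, padding=1)
pruned2.weight.data = conv2.weight.data[:, keep_idx].clone()
pruned2.bias.data = conv2.bias.data.clone()

# 3) Optimize the remaining kernels so that the compact model reproduces the
#    original feature maps on a small calibration batch (random tensors here;
#    a handful of real images in practice).
calib = torch.randn(16, 64, 32, 32)
with torch.no_grad():
    target = conv2(torch.relu(conv1(calib)))               # original feature maps

params = list(pruned1.parameters()) + list(pruned2.parameters())
optimizer = torch.optim.Adam(params, lr=1e-3)
for step in range(200):
    optimizer.zero_grad()
    output = pruned2(torch.relu(pruned1(calib)))
    loss = nn.functional.mse_loss(output, target)          # feature-map reconstruction
    loss.backward()
    optimizer.step()

print(f"Final reconstruction MSE: {loss.item():.6f}")
```

Because only one pruned layer pair is optimized at a time against a small calibration batch, this kind of recovery step is much cheaper than retraining the whole network, which is consistent with the speedup and reduced data requirements reported in the abstract.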
Source Journal

Computation (Mathematics – Applied Mathematics)
CiteScore: 3.50 · Self-citation rate: 4.50% · Articles published: 201 · Review time: 8 weeks
Journal Description: Computation is a journal of computational science and engineering. Topics include:
- Computational biology, including but not limited to: bioinformatics; mathematical modeling, simulation, and prediction of nucleic acid (DNA/RNA) and protein sequences, structure, and functions; mathematical modeling of pathways and genetic interactions; neuroscience computation, including neural modeling, brain theory, and neural networks.
- Computational chemistry, including but not limited to: new theories and methodology, including their applications in molecular dynamics; computation of electronic structure; density functional theory; design and characterization of materials with computational methods.
- Computation in engineering, including but not limited to: new theories, methodology, and the application of computational fluid dynamics (CFD); optimisation techniques and/or application of optimisation to multidisciplinary systems; system identification and reduced-order modelling of engineering systems; parallel algorithms and high-performance computing in engineering.
Latest Articles in This Journal:
- Analytical and Numerical Investigation of Two-Dimensional Heat Transfer with Periodic Boundary Conditions
- Enhancement of Machine-Learning-Based Flash Calculations near Criticality Using a Resampling Approach
- Corporate Bankruptcy Prediction Models: A Comparative Study for the Construction Sector in Greece
- Analysis of Effectiveness of Combined Surface Treatment Methods for Structural Parts with Holes to Enhance Their Fatigue Life
- A New Mixed Fractional Derivative with Applications in Computational Biology