Online Weight Pruning Via Adaptive Sparsity Loss

George Retsinas, Athena Elafrou, G. Goumas, P. Maragos
2021 IEEE International Conference on Image Processing (ICIP)
DOI: 10.1109/ICIP42928.2021.9506301 · Published 2021-09-19 · Citations: 2

Abstract

Pruning neural networks has regained interest in recent years as a means to compress state-of-the-art deep neural networks and enable their deployment on resource-constrained devices. In this paper, we propose a robust sparsity controlling framework that efficiently prunes network parameters during training with minimal computational overhead. We incorporate fast mechanisms to prune individual layers and build upon these to automatically prune the entire network under a user-defined budget constraint. Key to our end-to-end network pruning approach is the formulation of an intuitive and easy-to-implement adaptive sparsity loss used to explicitly control sparsity during training, enabling efficient budget-aware optimization.
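To make the idea of an adaptive sparsity loss concrete, the following is a minimal illustrative sketch, not the paper's actual formulation: it estimates a network's sparsity as the fraction of weights with magnitude below a small threshold, then adds a penalty proportional to the squared gap between that estimate and a user-defined sparsity budget. The function names, the threshold `eps`, and the penalty weight `lam` are all hypothetical; the paper presumably uses a differentiable surrogate so the term can drive gradient-based training.

```python
import numpy as np

def sparsity(weights, eps=1e-3):
    """Fraction of weights whose magnitude falls below eps (treated as prunable).
    Hypothetical estimator; the paper may define sparsity differently."""
    w = np.concatenate([p.ravel() for p in weights])
    return float(np.mean(np.abs(w) < eps))

def adaptive_sparsity_loss(weights, target, lam=1.0, eps=1e-3):
    """Penalize the squared shortfall of current sparsity versus the budget.
    This hard count is not differentiable; a real training loss would need
    a smooth surrogate (e.g. a sigmoid-based soft count)."""
    gap = target - sparsity(weights, eps)
    # Only penalize when the network is less sparse than the budget demands.
    return lam * max(gap, 0.0) ** 2

# Toy usage: two "layers", one with tiny weights, one with large weights.
rng = np.random.default_rng(0)
layers = [rng.normal(scale=0.01, size=100), rng.normal(scale=1.0, size=100)]
penalty = adaptive_sparsity_loss(layers, target=0.5)
```

In a training loop, this penalty would be added to the task loss, so the optimizer is steered toward the sparsity budget while fitting the data, which matches the abstract's description of explicit, budget-aware sparsity control during training.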