A Novel Pooling Method for Regularization of Deep Neural Networks

El houssaine Hssayni, M. Ettaouil
{"title":"A Novel Pooling Method for Regularization of Deep Neural networks","authors":"El houssaine Hssayni, M. Ettaouil","doi":"10.1109/ISCV49265.2020.9204322","DOIUrl":null,"url":null,"abstract":"Dropout has been introduced as a powerful regularization approach to preventing overfitting problem in deep neural networks, particularly in deep convolutional neural networks(DCNNs). A number of methods have been designed recently to improve and generalize the dropout technique. These methods include spectral dropout which achieves improved generalization and avoids overfitting by eliminating noisy and weak Fourier domain coefficients of the neural network activations. On the other hand, a pooling process plays a crucial role in deep convolutional neural networks, which serves to reduce the dimensionality of processed data for decreasing computational cost as well as for avoiding overfitting and enhancing the generalization capability of the network. For this reason, we focus on the pooling layer, and we propose a new pooling method called Spectral Dropout Pooling, by applying the Spectral dropout technique in the pooling region, in order to avoid overfitting problem, as well as to enhance the generalization ability of DCNNs. Experimental results on several image benchmarks show that Spectral Dropout Pooling outperforms the existing pooling methods in classification performance as well as is effective for improving the generalization ability of DCNNs. Moreover, we show that Spectral Dropout Pooling combined with other regularization methods, such as batch normalization, is competitive with other existing methods in classification performance.","PeriodicalId":313743,"journal":{"name":"2020 International Conference on Intelligent Systems and Computer Vision (ISCV)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2020-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 International Conference on Intelligent Systems and Computer Vision (ISCV)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISCV49265.2020.9204322","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Dropout has been introduced as a powerful regularization approach for preventing the overfitting problem in deep neural networks, particularly in deep convolutional neural networks (DCNNs). A number of methods have recently been designed to improve and generalize the dropout technique. These include spectral dropout, which improves generalization and avoids overfitting by eliminating noisy and weak Fourier-domain coefficients of the neural network activations. On the other hand, the pooling process plays a crucial role in deep convolutional neural networks: it reduces the dimensionality of the processed data, decreasing computational cost while helping to avoid overfitting and enhancing the generalization capability of the network. For this reason, we focus on the pooling layer and propose a new pooling method, called Spectral Dropout Pooling, which applies the spectral dropout technique within the pooling region in order to avoid overfitting and enhance the generalization ability of DCNNs. Experimental results on several image benchmarks show that Spectral Dropout Pooling outperforms existing pooling methods in classification performance and is effective at improving the generalization ability of DCNNs. Moreover, we show that Spectral Dropout Pooling combined with other regularization methods, such as batch normalization, is competitive with other existing methods in classification performance.
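To make the idea concrete, the sketch below illustrates one plausible reading of the abstract: within each pooling window, the activations are transformed to a spectral domain, weak or randomly selected coefficients are suppressed, and the filtered window is then reduced to a single value. This is a minimal illustration only; the DCT transform, the magnitude-thresholding rule, the `keep_ratio` and `drop_prob` parameters, and the final average reduction are assumptions for exposition, not the authors' exact formulation.

```python
# Illustrative sketch of a "spectral dropout in the pooling region" idea.
# Assumptions (not from the paper): DCT as the spectral transform,
# magnitude thresholding to drop weak coefficients, random dropout on the
# remaining coefficients, and average reduction of the filtered window.
import numpy as np
from scipy.fft import dctn, idctn


def spectral_dropout_pooling(feature_map, window=2, keep_ratio=0.5,
                             drop_prob=0.1, rng=None):
    """Pool a 2-D feature map with spectral dropout applied per pooling window.

    feature_map : (H, W) array; H and W must be divisible by `window`.
    keep_ratio  : fraction of largest-magnitude spectral coefficients kept.
    drop_prob   : probability of randomly zeroing a kept coefficient.
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w = feature_map.shape
    out = np.empty((h // window, w // window), dtype=feature_map.dtype)

    for i in range(0, h, window):
        for j in range(0, w, window):
            patch = feature_map[i:i + window, j:j + window]

            # Transform the pooling region to the spectral (DCT) domain.
            coeffs = dctn(patch, norm="ortho")

            # Keep only the strongest coefficients (suppress weak/noisy ones).
            k = max(1, int(round(keep_ratio * coeffs.size)))
            threshold = np.sort(np.abs(coeffs), axis=None)[-k]
            mask = np.abs(coeffs) >= threshold

            # Random dropout on the surviving spectral coefficients.
            mask &= rng.random(coeffs.shape) >= drop_prob

            # Back to the spatial domain, then reduce the window (average here).
            filtered = idctn(coeffs * mask, norm="ortho")
            out[i // window, j // window] = filtered.mean()

    return out


if __name__ == "__main__":
    x = np.random.randn(8, 8).astype(np.float32)
    print(spectral_dropout_pooling(x, window=2, keep_ratio=0.5, drop_prob=0.1))
```

In a real DCNN this operation would replace a standard max- or average-pooling layer and, like ordinary dropout, the random coefficient masking would typically be active only during training.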