On learning feedforward neural networks with noise injection into inputs

A. Seghouane, Y. Moudden, G. Fleury
DOI: 10.1109/NNSP.2002.1030026
Proceedings of the 12th IEEE Workshop on Neural Networks for Signal Processing, 2002-11-07
Citations: 7

Abstract

Injecting noise into the inputs during the training of feedforward neural networks (FNN) can remarkably improve their generalization performance. Reported works justify this fact by arguing that noise injection is equivalent to a smoothing regularization, with the input noise variance playing the role of the regularization parameter. The success of this approach depends on an appropriate choice of the input noise variance. However, it is often not known a priori whether the degree of smoothness imposed on the FNN mapping is consistent with the unknown function to be approximated. To gain better control over this smoothing effect, a cost function that balances the smoothed fitting induced by noise injection against the precision of approximation is proposed. The second term, which penalizes the undesirable effects of input noise injection and controls the deviation of the randomly perturbed cost, is obtained by expressing a certain distance between the original cost function and its randomly perturbed version. In fact, this term can be derived in general for parametric models that satisfy the Lipschitz property. An example is included to illustrate the effectiveness of learning with the proposed cost function when noise injection is used.
Latest articles in this journal
- Fusion of multiple experts in multimodal biometric personal identity verification systems
- A new SOLPN-based rate control algorithm for MPEG video coding
- Analog implementation for networks of integrate-and-fire neurons with adaptive local connectivity
- Removal of residual crosstalk components in blind source separation using LMS filters
- Functional connectivity modelling in fMRI based on causal networks