Tuning of learning rate and momentum on back-propagation

N. Kamiyama, N. Iijima, A. Taguchi, H. Mitsui, Y. Yoshida, M. Sone
{"title":"Tuning of learning rate and momentum on back-propagation","authors":"N. Kamiyama, N. Iijima, A. Taguchi, H. Mitsui, Y. Yoshida, M. Sone","doi":"10.1109/ICCS.1992.254895","DOIUrl":null,"url":null,"abstract":"It is known well that backpropagation is used in recognition and learning on neural networks. The backpropagation, modification of the weight is calculated by learning rate ( eta =0.2) and momentum ( alpha =0.9). The number of training cycles depends on eta and alpha , so that it is necessary to choose the most suitable values for eta and alpha . Then, changing eta and alpha , the authors tried to search for the most suitable values for the learning. As a result, the combination of eta and alpha given the minimum value of the number of training cycles behave under the constant rule. Thus eta =K(1- alpha ). Moreover, the constant K is decided by the ratio between the number of output units and hidden units or the initialized weight.<<ETX>>","PeriodicalId":223769,"journal":{"name":"[Proceedings] Singapore ICCS/ISITA `92","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1992-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"[Proceedings] Singapore ICCS/ISITA `92","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCS.1992.254895","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7

Abstract

Backpropagation is widely used for recognition and learning in neural networks. In backpropagation, the weight update is governed by a learning rate (typically eta = 0.2) and a momentum coefficient (typically alpha = 0.9). The number of training cycles required depends on eta and alpha, so it is necessary to choose suitable values for both. By varying eta and alpha, the authors searched for the values best suited to learning. As a result, they found that the combinations of eta and alpha that minimize the number of training cycles obey a constant rule: eta = K(1 - alpha). Moreover, the constant K is determined by the ratio of the number of output units to hidden units, or by the initial weights.
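The update rule the abstract refers to is standard gradient descent with momentum, with the learning rate coupled to the momentum coefficient via the paper's rule eta = K(1 - alpha). The sketch below illustrates that coupling on a toy quadratic loss; the value K = 1.0 and the optimizer loop are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def momentum_descent(grad, w0, eta, alpha, steps=200):
    """Gradient descent with momentum:
    v <- alpha * v - eta * grad(w);  w <- w + v."""
    w = np.array(w0, dtype=float)
    v = np.zeros_like(w)
    for _ in range(steps):
        v = alpha * v - eta * grad(w)
        w = w + v
    return w

# Toy quadratic loss L(w) = 0.5 * ||w||^2, so grad(w) = w.
grad = lambda w: w

# Couple eta to alpha via the paper's rule eta = K * (1 - alpha).
# K = 1.0 is an arbitrary illustrative choice; the paper derives K
# from the output/hidden unit ratio or the initial weights.
K = 1.0
for alpha in (0.0, 0.5, 0.9):
    eta = K * (1 - alpha)
    w_final = momentum_descent(grad, [1.0, -1.0], eta, alpha)
    print(f"alpha={alpha}, eta={eta:.2f}, |w|={np.linalg.norm(w_final):.2e}")
```

With the coupling in place, raising the momentum automatically lowers the step size, which keeps the effective step per unit of accumulated gradient roughly constant across the (eta, alpha) combinations tried.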