Statistical Mechanics of On-line Ensemble Teacher Learning through a Novel Perceptron Learning Rule

K. Hara, S. Miyoshi
{"title":"Statistical Mechanics of On-line Ensemble Teacher Learning through a Novel Perceptron Learning Rule","authors":"K. Hara, S. Miyoshi","doi":"10.1143/JPSJ.81.064002","DOIUrl":null,"url":null,"abstract":"In ensemble teacher learning, ensemble teachers have only uncertain information about the true teacher, and this information is given by an ensemble consisting of an infinite number of ensemble teachers whose variety is sufficiently rich. In this learning, a student learns from an ensemble teacher that is iteratively selected randomly from a pool of many ensemble teachers. An interesting point of ensemble teacher learning is the asymptotic behavior of the student to approach the true teacher by learning from ensemble teachers. The student performance is improved by using the Hebbian learning rule in the learning. However, the perceptron learning rule cannot improve the student performance. On the other hand, we proposed a perceptron learning rule with a margin. This learning rule is identical to the perceptron learning rule when the margin is zero and identical to the Hebbian learning rule when the margin is infinity. Thus, this rule connects the perceptron learning rule and the Hebbian learning rule continuously through the size of the margin. Using this rule, we study changes in the learning behavior from the perceptron learning rule to the Hebbian learning rule by considering several margin sizes. From the results, we show that by setting a margin of kappa > 0, the effect of an ensemble appears and becomes significant when a larger margin kappa is used.","PeriodicalId":8438,"journal":{"name":"arXiv: Disordered Systems and Neural Networks","volume":"22 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2016-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv: Disordered Systems and Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1143/JPSJ.81.064002","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In ensemble teacher learning, each ensemble teacher carries only uncertain information about the true teacher; this information is supplied by an ensemble of infinitely many ensemble teachers whose variety is sufficiently rich. The student learns from one ensemble teacher, selected at random from the pool at every learning step. A point of interest is the student's asymptotic behavior: whether, by learning from the ensemble teachers, it approaches the true teacher. Student performance improves when the Hebbian learning rule is used, whereas the perceptron learning rule cannot improve it. We therefore proposed a perceptron learning rule with a margin. This rule is identical to the perceptron learning rule when the margin is zero and to the Hebbian learning rule when the margin is infinite, so it interpolates continuously between the two through the size of the margin. Using this rule, we study how the learning behavior changes from the perceptron rule to the Hebbian rule by considering several margin sizes. The results show that with a margin kappa > 0 the effect of the ensemble appears, and it becomes more significant as a larger margin kappa is used.
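The margin rule described above admits a compact on-line simulation. The Python sketch below is illustrative only, not the authors' code: the pool construction (ensemble teachers built as noisy copies of the true teacher with direction cosine about 0.7), the learning rate eta, the margin kappa, and the field-scaling conventions are assumptions made for this example. The abstract fixes only the limits of the rule: the perceptron rule at zero margin and the Hebbian rule at infinite margin.

import numpy as np

rng = np.random.default_rng(0)

N = 2000           # input dimension (the theory works in the limit N -> infinity)
K = 10             # number of ensemble teachers in the pool (assumed value)
eta = 1.0          # learning rate (assumed value)
kappa = 1.0        # margin: kappa = 0 gives the perceptron rule,
                   # kappa -> infinity gives the Hebbian rule
steps = 20 * N     # number of on-line learning steps

# True teacher A and a pool of ensemble teachers B_k. Each B_k is a noisy copy
# of A with direction cosine ~0.7 (an illustrative choice of overlap).
A = rng.standard_normal(N)
B = 0.7 * A + np.sqrt(1.0 - 0.7**2) * rng.standard_normal((K, N))

J = rng.standard_normal(N)          # student weight vector

for _ in range(steps):
    x = rng.standard_normal(N) / np.sqrt(N)   # input with components of variance 1/N
    k = rng.integers(K)                        # ensemble teacher chosen at random
    v = np.sign(B[k] @ x)                      # selected teacher's output
    u = J @ x                                  # student's local field
    # Margin perceptron rule: update whenever the student's field, measured on
    # the teacher's side of the decision boundary, falls below the margin kappa.
    if u * v < kappa:
        J += eta * v * x

# Overlap with the TRUE teacher, which the student never observes directly.
R = (J @ A) / (np.linalg.norm(J) * np.linalg.norm(A))
eps = np.arccos(np.clip(R, -1.0, 1.0)) / np.pi   # generalization error
print(f"R = {R:.3f}, generalization error = {eps:.3f}")

Setting kappa = 0.0 in the sketch reduces the condition u * v < kappa to a pure misclassification test (perceptron rule), while a very large kappa makes the update effectively unconditional (Hebbian rule), so sweeping kappa traces the transition studied in the paper.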