{"title":"基于新型感知器学习规则的在线集成教师学习的统计力学","authors":"K. Hara, S. Miyoshi","doi":"10.1143/JPSJ.81.064002","DOIUrl":null,"url":null,"abstract":"In ensemble teacher learning, ensemble teachers have only uncertain information about the true teacher, and this information is given by an ensemble consisting of an infinite number of ensemble teachers whose variety is sufficiently rich. In this learning, a student learns from an ensemble teacher that is iteratively selected randomly from a pool of many ensemble teachers. An interesting point of ensemble teacher learning is the asymptotic behavior of the student to approach the true teacher by learning from ensemble teachers. The student performance is improved by using the Hebbian learning rule in the learning. However, the perceptron learning rule cannot improve the student performance. On the other hand, we proposed a perceptron learning rule with a margin. This learning rule is identical to the perceptron learning rule when the margin is zero and identical to the Hebbian learning rule when the margin is infinity. Thus, this rule connects the perceptron learning rule and the Hebbian learning rule continuously through the size of the margin. Using this rule, we study changes in the learning behavior from the perceptron learning rule to the Hebbian learning rule by considering several margin sizes. From the results, we show that by setting a margin of kappa > 0, the effect of an ensemble appears and becomes significant when a larger margin kappa is used.","PeriodicalId":8438,"journal":{"name":"arXiv: Disordered Systems and Neural Networks","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2016-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Statistical Mechanics of On-line Ensemble Teacher Learning through a Novel Perceptron Learning Rule\",\"authors\":\"K. Hara, S. Miyoshi\",\"doi\":\"10.1143/JPSJ.81.064002\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In ensemble teacher learning, ensemble teachers have only uncertain information about the true teacher, and this information is given by an ensemble consisting of an infinite number of ensemble teachers whose variety is sufficiently rich. In this learning, a student learns from an ensemble teacher that is iteratively selected randomly from a pool of many ensemble teachers. An interesting point of ensemble teacher learning is the asymptotic behavior of the student to approach the true teacher by learning from ensemble teachers. The student performance is improved by using the Hebbian learning rule in the learning. However, the perceptron learning rule cannot improve the student performance. On the other hand, we proposed a perceptron learning rule with a margin. This learning rule is identical to the perceptron learning rule when the margin is zero and identical to the Hebbian learning rule when the margin is infinity. Thus, this rule connects the perceptron learning rule and the Hebbian learning rule continuously through the size of the margin. Using this rule, we study changes in the learning behavior from the perceptron learning rule to the Hebbian learning rule by considering several margin sizes. 
From the results, we show that by setting a margin of kappa > 0, the effect of an ensemble appears and becomes significant when a larger margin kappa is used.\",\"PeriodicalId\":8438,\"journal\":{\"name\":\"arXiv: Disordered Systems and Neural Networks\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-08-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv: Disordered Systems and Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1143/JPSJ.81.064002\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv: Disordered Systems and Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1143/JPSJ.81.064002","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Statistical Mechanics of On-line Ensemble Teacher Learning through a Novel Perceptron Learning Rule
In ensemble teacher learning, the ensemble teachers have only uncertain information about the true teacher; this information is provided by an ensemble consisting of an infinite number of ensemble teachers whose variety is sufficiently rich. In this setting, a student learns from an ensemble teacher selected at random from the pool at each iteration. An interesting aspect of ensemble teacher learning is the asymptotic behavior of the student, which approaches the true teacher by learning from the ensemble teachers. The Hebbian learning rule improves the student's performance in this setting, whereas the perceptron learning rule does not. We therefore propose a perceptron learning rule with a margin. This rule is identical to the perceptron learning rule when the margin is zero and identical to the Hebbian learning rule when the margin is infinite, so it connects the two rules continuously through the size of the margin. Using this rule, we study how the learning behavior changes from that of the perceptron learning rule to that of the Hebbian learning rule for several margin sizes. The results show that with a margin κ > 0 the effect of the ensemble appears, and it becomes more significant as a larger margin κ is used.
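To make the rule concrete, below is a minimal numerical sketch of the setup described in the abstract, written in Python/NumPy. It is not the authors' code: the input dimension N, learning rate eta, number of ensemble teachers K, the update count T, and the construction of the ensemble teachers as noisy copies of the true teacher are all illustrative assumptions (the paper treats an infinite, sufficiently varied ensemble analytically). The sketch only illustrates the margin-κ update, which reduces to the perceptron rule at κ = 0 and to the Hebbian rule as κ → ∞.

```python
import numpy as np

# Minimal sketch (assumed setup, not the authors' exact formulation) of
# on-line ensemble teacher learning with a margin-kappa perceptron rule.

rng = np.random.default_rng(0)

N = 1000        # input dimension (assumed)
K = 30          # number of ensemble teachers (assumed; the paper considers an infinite pool)
eta = 0.1       # learning rate (assumed)
kappa = 1.0     # margin: kappa = 0 -> perceptron rule, kappa -> infinity -> Hebbian rule
T = 20 * N      # number of on-line learning steps (assumed)

# True teacher A and ensemble teachers B_k carrying only uncertain information
# about A (here modeled as noisy copies; the noise level 1.0 is arbitrary).
A = rng.standard_normal(N)
B = A + 1.0 * rng.standard_normal((K, N))

# Student J, initialized at random.
J = rng.standard_normal(N)

def overlap(u, v):
    """Direction cosine between two weight vectors (closer to 1 is better)."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

for t in range(T):
    x = rng.standard_normal(N) / np.sqrt(N)   # random input example
    Bk = B[rng.integers(K)]                   # ensemble teacher drawn at random this step
    teacher_out = np.sign(Bk @ x)             # label provided by that ensemble teacher
    # Perceptron rule with margin: update whenever the student's local field,
    # projected onto the teacher's label, falls below kappa.
    if (J @ x) * teacher_out < kappa:
        J += eta * teacher_out * x

print("overlap(student, true teacher) =", overlap(J, A))
```

Under these assumptions, sweeping κ from 0 toward large values is how one would probe numerically the transition from perceptron-like to Hebbian-like learning behavior that the paper analyzes.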