{"title":"一种新的MNN训练方法与实证研究","authors":"Jiasen Wang, Pan Wang","doi":"10.1109/GCIS.2012.35","DOIUrl":null,"url":null,"abstract":"Based on the thought of “to be expert in one aspect and good at many”, a new training method of modular neural network (MNN) is presented. The key point of this method is a subnet learns the neighbor data sets while fulfiling its main task : learning the objective data set. Both methodology and empirical study of this new method are presented. Two examples (static approximation and nonlinear dynamic system prediction) are tested to show the new method's effectiveness: average testing error is dramatically decreased compared to original algorithm..","PeriodicalId":337629,"journal":{"name":"2012 Third Global Congress on Intelligent Systems","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"A New MNN's Training Method with Empirical Study\",\"authors\":\"Jiasen Wang, Pan Wang\",\"doi\":\"10.1109/GCIS.2012.35\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Based on the thought of “to be expert in one aspect and good at many”, a new training method of modular neural network (MNN) is presented. The key point of this method is a subnet learns the neighbor data sets while fulfiling its main task : learning the objective data set. Both methodology and empirical study of this new method are presented. Two examples (static approximation and nonlinear dynamic system prediction) are tested to show the new method's effectiveness: average testing error is dramatically decreased compared to original algorithm..\",\"PeriodicalId\":337629,\"journal\":{\"name\":\"2012 Third Global Congress on Intelligent Systems\",\"volume\":\"11 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2012-11-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2012 Third Global Congress on Intelligent Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/GCIS.2012.35\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2012 Third Global Congress on Intelligent Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/GCIS.2012.35","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: Based on the idea of "being expert in one aspect and good at many", a new training method for modular neural networks (MNN) is presented. The key point of this method is that each subnet learns the neighboring data sets while fulfilling its main task: learning its own objective data set. Both the methodology and an empirical study of the new method are presented. Two examples (static function approximation and nonlinear dynamic system prediction) are tested to show the new method's effectiveness: the average testing error is dramatically decreased compared to the original algorithm.
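The abstract does not give the training details, but the core idea (each subnet trained on its objective data set plus its neighbors, then used only for its own region at test time) can be illustrated with a minimal sketch. The following Python example is an assumption-laden illustration, not the authors' algorithm: it assumes a 1-D input space split into contiguous partitions, uses scikit-learn's MLPRegressor as each subnet, and takes "neighbor data sets" to mean the adjacent partitions; the paper's actual weighting and gating scheme is not specified here.

```python
# Sketch of the "expert in one aspect, good at many" training idea.
# Assumptions (not from the paper): 1-D toy task, contiguous partitions,
# MLPRegressor subnets, neighbors = adjacent partitions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy static approximation task: y = sin(x) on [0, 2*pi].
X = rng.uniform(0.0, 2.0 * np.pi, size=(600, 1))
y = np.sin(X).ravel()

# Partition the data into k contiguous regions (the "objective data sets").
k = 4
edges = np.linspace(0.0, 2.0 * np.pi, k + 1)
region = np.clip(np.digitize(X.ravel(), edges) - 1, 0, k - 1)

subnets = []
for i in range(k):
    # Each subnet's training set: its own region plus the neighboring regions.
    neighbors = [j for j in (i - 1, i, i + 1) if 0 <= j < k]
    mask = np.isin(region, neighbors)
    net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=i)
    net.fit(X[mask], y[mask])
    subnets.append(net)

# At test time, each input is routed to the subnet whose region it falls in.
X_test = rng.uniform(0.0, 2.0 * np.pi, size=(200, 1))
test_region = np.clip(np.digitize(X_test.ravel(), edges) - 1, 0, k - 1)
y_pred = np.array([subnets[r].predict(x.reshape(1, -1))[0]
                   for r, x in zip(test_region, X_test)])
print("mean absolute test error:", np.mean(np.abs(y_pred - np.sin(X_test).ravel())))
```

Compared with training each subnet only on its own partition, the neighbor data acts as a simple form of overlap that can smooth behavior near region boundaries; whether and how much this reduces testing error in the paper's two examples is reported in the full text.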