{"title":"神经网络集成的随机分离学习","authors":"Yong Liu","doi":"10.1109/CISP-BMEI.2017.8302328","DOIUrl":null,"url":null,"abstract":"In order to prevent the individual neural networks from becoming similar in the long learning period of negative correlation learning for designing neural network ensembles, two approaches were adopted in this paper. The first approach is to replace large neural networks with small neural networks in neural network ensembles. Samll neural networks would be more practical in the real applications when the capability is limited. The second approach is to introduce random separation learning in negative correlation learning for each small neural network. The idea of random separation learning is to let each individual neural network learn differently on the randomly separated subsets of the given training samples. It has been found that the small neural networks could easily become weak and different each other by negative correlation learning with random separation learning. After applying large number of small neural networks for neural network ensembles, two combination methods were used to generate the output of the neural network ensembles while their performance had been compared.","PeriodicalId":6474,"journal":{"name":"2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)","volume":"108 1","pages":"1-4"},"PeriodicalIF":0.0000,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Random separation learning for neural network ensembles\",\"authors\":\"Yong Liu\",\"doi\":\"10.1109/CISP-BMEI.2017.8302328\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In order to prevent the individual neural networks from becoming similar in the long learning period of negative correlation learning for designing neural network ensembles, two approaches were adopted in this paper. 
The first approach is to replace large neural networks with small neural networks in neural network ensembles. Samll neural networks would be more practical in the real applications when the capability is limited. The second approach is to introduce random separation learning in negative correlation learning for each small neural network. The idea of random separation learning is to let each individual neural network learn differently on the randomly separated subsets of the given training samples. It has been found that the small neural networks could easily become weak and different each other by negative correlation learning with random separation learning. After applying large number of small neural networks for neural network ensembles, two combination methods were used to generate the output of the neural network ensembles while their performance had been compared.\",\"PeriodicalId\":6474,\"journal\":{\"name\":\"2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)\",\"volume\":\"108 1\",\"pages\":\"1-4\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CISP-BMEI.2017.8302328\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics 
(CISP-BMEI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISP-BMEI.2017.8302328","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Random separation learning for neural network ensembles
To prevent the individual neural networks from becoming similar to one another over the long learning period of negative correlation learning when designing neural network ensembles, two approaches were adopted in this paper. The first is to replace large neural networks with small ones in the ensemble; small neural networks are more practical in real applications where capacity is limited. The second is to introduce random separation learning into negative correlation learning for each small neural network. The idea of random separation learning is to let each individual network learn differently on randomly separated subsets of the given training samples. It was found that small neural networks readily become weak and different from each other under negative correlation learning with random separation learning. After applying a large number of small neural networks to the ensemble, two combination methods were used to generate the ensemble output, and their performance was compared.
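The combination of negative correlation learning (NCL) and random separation described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the toy regression data, the 3-unit hidden layer, the penalty strength λ = 0.5, the learning rate, and simple averaging as the combination method are all assumptions chosen for demonstration. The NCL update uses the standard gradient form (F_i − y) − λ(F_i − F̄), where F̄ is the ensemble mean, and each network receives a freshly drawn random subset of the training set every epoch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (assumption: the paper's datasets are not given here).
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

class SmallNet:
    """A deliberately small one-hidden-layer network, as the paper advocates."""
    def __init__(self, n_hidden=3):
        self.W1 = rng.standard_normal((1, n_hidden)) * 0.5
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.standard_normal(n_hidden) * 0.5
        self.b2 = 0.0

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)   # cache activations for backprop
        return self.h @ self.w2 + self.b2

    def backward(self, X, delta, lr=0.05):
        # delta = dE/dF per sample; plain gradient descent through the net.
        n = len(delta)
        gw2 = self.h.T @ delta / n
        gb2 = delta.mean()
        dh = np.outer(delta, self.w2) * (1.0 - self.h ** 2)
        gW1 = X.T @ dh / n
        gb1 = dh.mean(axis=0)
        self.w2 -= lr * gw2; self.b2 -= lr * gb2
        self.W1 -= lr * gW1; self.b1 -= lr * gb1

M, lam = 8, 0.5                       # ensemble size and NCL penalty (assumed values)
nets = [SmallNet() for _ in range(M)]

for epoch in range(500):
    # Random separation: shuffle and split the training set into M subsets,
    # one per network, re-drawn every epoch so each net sees different data.
    subsets = np.array_split(rng.permutation(len(X)), M)
    F = np.stack([net.forward(X) for net in nets])    # (M, N) individual outputs
    Fbar = F.mean(axis=0)                             # ensemble output
    for i, net in enumerate(nets):
        s = subsets[i]
        net.forward(X[s])                             # refresh cached activations
        # NCL gradient: error term minus the correlation penalty term.
        delta = (F[i, s] - y[s]) - lam * (F[i, s] - Fbar[s])
        net.backward(X[s], delta)

# Simple averaging as one possible combination method (the paper compares two,
# which are not specified in this abstract).
pred = np.mean([net.forward(X) for net in nets], axis=0)
mse = float(np.mean((pred - y) ** 2))
```

With λ = 0, the update reduces to each network independently fitting its own subset; the penalty term pushes each network's output away from the ensemble mean, which is what keeps the small networks weak but mutually different.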