{"title":"A new learning algorithm for feedforward neural networks","authors":"Derong Liu, T. Chang, Yi Zhang","doi":"10.1109/ISIC.2001.971481","DOIUrl":null,"url":null,"abstract":"We develop in the present paper a constructive learning algorithm for feedforward neural networks. We employ an incremental training procedure where training patterns are learned one by one. Our algorithm starts with a single training pattern and a single hidden layer neuron. During the course of neural network training, when the algorithm gets stuck in a local minimum, we will attempt to escape from the local minimum by using the weight scaling technique. It is only after several consecutive failed attempts in escaping from a local minimum, we will allow the network to grow by adding a hidden layer neuron. At this stage, we employ an optimization procedure based on quadratic/linear programming to select initial weights for the newly added neuron. Our optimization procedure tends to make the network reach the error tolerance with no or little training after adding a hidden layer neuron Our simulation results indicate that the present constructive algorithm can obtain neural networks very close to minimal structures and that convergence (to a solution) in neural network training can be guaranteed. We tested our algorithm extensively using the parity problem.","PeriodicalId":367430,"journal":{"name":"Proceeding of the 2001 IEEE International Symposium on Intelligent Control (ISIC '01) (Cat. No.01CH37206)","volume":"78 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2001-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceeding of the 2001 IEEE International Symposium on Intelligent Control (ISIC '01) (Cat. 
No.01CH37206)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISIC.2001.971481","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9
Abstract
We develop in this paper a constructive learning algorithm for feedforward neural networks. We employ an incremental training procedure in which training patterns are learned one by one. Our algorithm starts with a single training pattern and a single hidden layer neuron. During training, when the algorithm gets stuck in a local minimum, we attempt to escape from it using a weight scaling technique. Only after several consecutive failed attempts to escape a local minimum do we allow the network to grow by adding a hidden layer neuron. At this stage, we employ an optimization procedure based on quadratic/linear programming to select initial weights for the newly added neuron. This optimization procedure tends to let the network reach the error tolerance with little or no further training after a hidden layer neuron is added. Our simulation results indicate that the present constructive algorithm obtains neural networks very close to minimal structures and that convergence (to a solution) in neural network training can be guaranteed. We tested our algorithm extensively on the parity problem.
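The growth loop the abstract describes (train, try to escape a local minimum by weight scaling, and grow the hidden layer only after repeated failed escapes) can be sketched as follows. This is a minimal illustration rather than the paper's implementation: the quadratic/linear-programming initialization of newly added neurons is replaced here by small random initialization, and all hyperparameters (learning rate, scaling factor, stall patience, error tolerance) are assumptions chosen for the 2-bit parity (XOR) example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ConstructiveNet:
    """One-hidden-layer sigmoid network whose hidden layer can grow."""

    def __init__(self, n_in, n_hidden=1):
        self.W1 = rng.normal(0, 0.5, (n_hidden, n_in + 1))  # hidden weights (+ bias column)
        self.W2 = rng.normal(0, 0.5, (1, n_hidden + 1))     # output weights (+ bias column)

    def forward(self, X):
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])
        H = sigmoid(Xb @ self.W1.T)
        Hb = np.hstack([H, np.ones((H.shape[0], 1))])
        return sigmoid(Hb @ self.W2.T), H, Xb

    def train_epoch(self, X, y, lr=0.5):
        # One full-batch gradient-descent step on the mean squared error.
        out, H, Xb = self.forward(X)
        err = out - y
        d_out = err * out * (1 - out)                       # (N, 1)
        Hb = np.hstack([H, np.ones((H.shape[0], 1))])
        self.W2 -= lr * (d_out.T @ Hb) / len(X)
        d_hid = (d_out @ self.W2[:, :-1]) * H * (1 - H)     # (N, n_hidden)
        self.W1 -= lr * (d_hid.T @ Xb) / len(X)
        return float(np.mean(err ** 2))

    def scale_weights(self, factor=1.5):
        # Weight-scaling escape attempt: enlarge hidden weights to push
        # the sigmoids toward their saturated regions.
        self.W1 *= factor

    def add_neuron(self):
        # Grow the hidden layer by one neuron. The paper selects these
        # initial weights by quadratic/linear programming; this sketch
        # uses small random values instead.
        self.W1 = np.vstack([self.W1, rng.normal(0, 0.5, (1, self.W1.shape[1]))])
        self.W2 = np.hstack([self.W2[:, :-1], rng.normal(0, 0.5, (1, 1)), self.W2[:, -1:]])

def train_constructive(X, y, tol=0.02, max_epochs=20000, patience=2000, max_escapes=3):
    net = ConstructiveNet(X.shape[1])
    best, stall, escapes = np.inf, 0, 0
    mse = np.inf
    for _ in range(max_epochs):
        mse = net.train_epoch(X, y)
        if mse < tol:                      # reached the error tolerance
            break
        if mse < best - 1e-6:
            best, stall = mse, 0
        else:
            stall += 1
        if stall >= patience:              # stuck in a (near-)flat region
            stall = 0
            if escapes < max_escapes:
                net.scale_weights()        # first try to escape in place
                escapes += 1
            else:
                net.add_neuron()           # grow only after repeated failures
                escapes = 0
    return net, mse

# 2-bit parity (XOR), the smallest instance of the parity benchmark
# family used in the paper.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
net, mse = train_constructive(X, y)
```

A single hidden sigmoid neuron cannot represent XOR, so training necessarily stalls; the loop then exhausts its weight-scaling attempts and grows the hidden layer, after which the problem becomes representable.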