Upper bounds on the node numbers of hidden layers in MLPs
Jiang Liu, Feng Ni, Mingjun Du, Xuyang Zhang, Zhongli Que, Shihang Song
Neural Network World, 2021. DOI: 10.14311/nnw.2021.31.016
Abstract
Determining the number of nodes in the hidden layers of a neural network is a fundamental and challenging problem. Various efforts have been made to study the relation between approximation ability and the number of hidden nodes for specific neural networks, such as single-hidden-layer and two-hidden-layer feedforward neural networks with specific or conditional activation functions. For arbitrary feedforward neural networks, however, there are few theoretical results on this issue. This paper gives an upper bound on the node number of each hidden layer for the most general feedforward neural networks, multilayer perceptrons (MLPs), from an algebraic point of view. First, we put forward the method of expansion linear spaces to investigate the algebraic structure and properties of the outputs of MLPs. Then it is proved that, given k distinct training samples, for any MLP with k nodes in each hidden layer, if a certain optimization problem has solutions, the approximation error remains invariant as nodes are added to the hidden layers. Furthermore, it is shown that for any MLP whose output-layer activation function is bounded on R, at most k hidden nodes in each hidden layer are needed to learn k training samples.
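To make the final claim concrete, the short sketch below is a minimal numerical illustration under simplifying assumptions, not the paper's expansion-linear-spaces construction: it uses a single hidden layer, scalar inputs, and a linear output unit, and the names k, x, y, W, b, H, v are introduced here purely for illustration. It fits k distinct samples exactly with an MLP that has only k hidden nodes, by solving one linear system for the output weights.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 8                                   # number of distinct training samples
x = np.linspace(-1.0, 1.0, k)           # k distinct scalar inputs
y = rng.normal(size=k)                  # arbitrary scalar targets

# Hidden layer with exactly k tanh nodes and random weights/biases.
W = rng.normal(size=(k, 1))             # hidden weights (k nodes, 1 input)
b = rng.normal(size=k)                  # hidden biases
H = np.tanh(x[:, None] @ W.T + b)       # k x k matrix of hidden activations

# For generic W, b and distinct inputs, H is invertible, so a linear output
# layer can reproduce the k targets exactly via one linear solve.
v = np.linalg.solve(H, y)               # output weights solving H v = y
print("max training error:", np.max(np.abs(H @ v - y)))   # near machine precision
```

This only illustrates the flavor of the result (k hidden nodes can interpolate k samples in a simple setting); the paper's theorem covers MLPs with arbitrarily many hidden layers and a bounded output-layer activation.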
Journal description:
Neural Network World is a bimonthly journal presenting the latest developments in the field of informatics, with attention devoted mainly to the problems of:
brain science,
theory and applications of neural networks (both artificial and natural),
fuzzy-neural systems,
methods and applications of evolutionary algorithms,
methods of parallel and massively parallel computing,
problems of soft computing,
methods of artificial intelligence.