Title: Handwritten alpha-numeric recognition by a self-growing neural network "CombNET-II"
Authors: A. Iwata, Y. Suwa, Y. Ino, N. Suzumura
Published in: [Proceedings 1992] IJCNN International Joint Conference on Neural Networks, 1992-06-07
DOI: 10.1109/IJCNN.1992.227337
Citations: 12
Abstract
CombNET-II is a self-growing four-layer neural network model with a comb structure. The first layer constitutes a stem network that quantizes the input feature vector space into several subspaces; layers 2-4 constitute branch network modules that classify the input data in each subspace into specified categories. The stem network is trained with a self-growing learning procedure, and the branch networks are trained by backpropagation. Each branch module is a three-layer hierarchical network with a restricted number of output neurons and interconnections, so it is easy to train. CombNET-II therefore avoids becoming trapped in local minima, since the stem network limits the complexity of the problem each branch module must solve. CombNET-II correctly classified 99.0% of previously unseen handwritten alpha-numeric characters.
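The stem/branch decomposition described in the abstract can be sketched as follows. This is an illustrative toy, not the paper's implementation: the self-growing stem learning procedure is replaced here by fixed nearest-centroid quantization, each branch's input is centered on its centroid, and all class and parameter names are assumptions for the example.

```python
# Illustrative sketch of a CombNET-style stem/branch classifier.
# Assumptions (not from the paper): the self-growing stem is replaced
# by fixed nearest-centroid quantization, branch inputs are centered
# on their centroid, and each branch is a small one-hidden-layer
# sigmoid network trained by plain backpropagation.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class Branch:
    """Three-layer (input-hidden-output) network trained by backprop."""
    def __init__(self, n_in, n_hidden, n_out, lr=1.0):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.lr = lr

    def forward(self, x):
        self.h = sigmoid(x @ self.W1)      # hidden activations
        return sigmoid(self.h @ self.W2)   # output activations

    def train_step(self, x, target):
        o = self.forward(x)
        d_o = (o - target) * o * (1.0 - o)                  # output delta
        d_h = (d_o @ self.W2.T) * self.h * (1.0 - self.h)   # hidden delta
        self.W2 -= self.lr * np.outer(self.h, d_o)
        self.W1 -= self.lr * np.outer(x, d_h)

class CombNet:
    def __init__(self, centroids, n_hidden, n_out):
        self.centroids = np.asarray(centroids, float)  # one per subspace
        self.n_out = n_out
        self.branches = [Branch(self.centroids.shape[1], n_hidden, n_out)
                         for _ in range(len(self.centroids))]

    def route(self, x):
        # Stem: send x to the subspace whose centroid is nearest.
        return int(np.argmin(np.linalg.norm(self.centroids - x, axis=1)))

    def fit(self, X, y, epochs=500):
        for _ in range(epochs):
            for x, label in zip(X, y):
                i = self.route(x)
                t = np.zeros(self.n_out)
                t[label] = 1.0
                self.branches[i].train_step(x - self.centroids[i], t)

    def predict(self, x):
        i = self.route(x)
        return int(np.argmax(self.branches[i].forward(x - self.centroids[i])))
```

On a toy two-cluster problem, each branch only has to separate the few patterns routed to it, which illustrates how the stem keeps each branch's sub-problem small enough for backpropagation to handle without getting stuck.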