Guojian Cheng, Tianshi Liu, Kuisheng Wang, Jiaxin Han
Soft Competitive Learning and Growing Self-Organizing Neural Networks for Pattern Classification
2006 Eighth International Symposium on Symbolic and Numeric Algorithms for Scientific Computing
Published: 2006-09-26
DOI: 10.1109/SYNASC.2006.68
Citations: 5
Abstract
Competitive learning can be defined as an adaptive process in which the neurons of an artificial neural network gradually become sensitive to different input categories, which are sets of patterns in a specific domain of the input space. By using competitive learning, Kohonen's self-organizing maps (KSOM) can generate mappings from high-dimensional signal spaces to lower-dimensional topological structures. The main features of KSOM are the formation of topology-preserving feature maps and the approximation of the input probability distribution. However, KSOM has some shortcomings, e.g., a fixed number of neural units and a fixed topology dimensionality, which can cause problems when this dimensionality does not match the dimensionality of the feature manifold. Compared to KSOM, growing self-organizing neural networks (GSONN) can change their topological structures during learning. The topology formation of both GSONN and KSOM is driven by soft competitive learning. This paper first gives an introduction to KSOM and the neural gas network. Then, we discuss some GSONN without fixed dimensionality, such as growing neural gas, and the authors' model, twin growing neural gas, and its application to pattern classification. The paper ends with some conclusions.
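To make "soft competitive learning" concrete, here is a minimal sketch of the standard neural gas update rule (Martinetz and Schulten), one of the models the abstract mentions. Unlike hard (winner-take-all) competition, every unit adapts toward each input, weighted by its distance rank. This is a generic illustration, not the authors' twin growing neural gas model; the function name and parameters are chosen for this sketch.

```python
import numpy as np

def neural_gas_step(weights, x, eps=0.1, lam=1.0):
    """One soft competitive learning step in the neural gas style:
    every unit moves toward input x, weighted by its distance rank."""
    dists = np.linalg.norm(weights - x, axis=1)
    ranks = np.argsort(np.argsort(dists))   # rank 0 = closest (winning) unit
    h = np.exp(-ranks / lam)                # soft neighborhood factor per unit
    return weights + eps * h[:, None] * (x - weights)

# Toy usage: 5 units adapting to random 2-D inputs in the unit square.
rng = np.random.default_rng(0)
W = rng.random((5, 2))
for _ in range(200):
    W = neural_gas_step(W, rng.random(2))
```

Because the rank-based neighborhood factor decays but never reaches zero, all units share in each adaptation step; growing variants such as growing neural gas add and remove units during this process instead of fixing their number in advance.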