Learning how to grasp under supervision
A. Sanchez, G. Hirzinger
Pub Date: 1992-06-07 | DOI: 10.1109/IJCNN.1992.226894
The problem of grasping a generic sphere is addressed. A supervised learning approach using a multilayer neural network for learning the position in 3D space and the radius of the sphere is introduced. Learning is based on laser range finder measurements of the surface of spheres of known radii at known positions. The problem is first formulated. An analytical solution for a set of four laser range finders and a solution based on supervised learning are then given and compared. Experimental results showing the feasibility and novelty of the approach are reported.
{"title":"Learning how to grasp under supervision","authors":"A. Sanchez, G. Hirzinger","doi":"10.1109/IJCNN.1992.226894","DOIUrl":"https://doi.org/10.1109/IJCNN.1992.226894","url":null,"abstract":"The problem of grasping a generic sphere is addressed. A supervised learning approach using a multilayer neural network for learning the position in 3D space and the radius of the sphere is introduced. Learning is based on laser range finder measurements of the surface of spheres of known radii at known positions. The problem is first formulated. An analytical solution for a set of four laser range finders and a solution based on supervised learning are then given and compared. Experimental results showing the feasibility and novelty of the approach are reported.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131340584","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Fuzzy reasoning implemented by neural networks
J. Nie, D. Linkens
Pub Date: 1992-06-07 | DOI: 10.1109/IJCNN.1992.226905
Viewing a given rule base as defining a global linguistic association constrained by fuzzy sets, approximate reasoning is implemented by a backpropagation neural network (BNN) with the aid of fuzzy set theory. The underlying principles are examined in detail using two examples, paying particular attention to the generalization capability of the BNN. The simulation results indicate the feasibility of the BNN-based approach. It is demonstrated that a forward-chaining fuzzy reasoning system with parallel rule bases can be implemented within the framework of neural networks. The studies of the BNN-based fuzzy controller suggest that, beyond a superficial resemblance between rules and patterns in the logic-based and BNN-based approaches, there is a deeper similarity in how the two process information: fuzziness versus distributiveness.
{"title":"Fuzzy reasoning implemented by neural networks","authors":"J. Nie, D. Linkens","doi":"10.1109/IJCNN.1992.226905","DOIUrl":"https://doi.org/10.1109/IJCNN.1992.226905","url":null,"abstract":"Viewing the given rule-base as defining a global linguistic association constrained by fuzzy sets, approximate reasoning is implemented by a backpropagation neural network (BNN) with the aid of the fuzzy set theory. The underlying principles are examined in detail using two examples, paying particular attention to the capability of generalization of the BNN. The simulation results indicate the feasibility of the BNN-based approach. It is demonstrated that a forward-chaining fuzzy reasoning system with parallel rule-bases can be implemented within the framework of neural networks. The studies into the BNN-based fuzzy controller suggest that, besides a seeming resemblance between rules and patterns in the logic-based and BNN-based approaches, there exists a deeper similarity in the information processing aspect in them, namely, fuzziness vs. distributiveness.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"348 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131990533","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Recurrent refractory neural field
S. Aityan
Pub Date: 1992-06-07 | DOI: 10.1109/IJCNN.1992.227276
The author introduces and describes a recurrent neural field made up of refractory neurons. Every neuron of the field is randomly connected to the others through stochastically distributed bonds and exhibits a stochastic, time-varying threshold. The location of each neuron in the space-time firing state of the field is taken into account. The term neural field is used to describe a single-layer intraconnected neural network. The input and output signals are distributed between clusters of closely connected neurons. A simple example shows the good logical abilities of the refractory neural field: a three-neuron refractory field is able to solve the exclusive-OR problem.
{"title":"Recurrent refractory neural field","authors":"S. Aityan","doi":"10.1109/IJCNN.1992.227276","DOIUrl":"https://doi.org/10.1109/IJCNN.1992.227276","url":null,"abstract":"The author introduces and describes a recurrent neural field that is made up of refractory neurons. Every neutron of the field is chaotically connected to others with stochastically distributed bonds and shows the stochastical time varying threshold. The paper takes the location of the neuron in the space-time-firing state of the neuron field into account. The term neural field is used to describe a single layer intraconnected neural network. The input and output signals are distributed between clusters of closely connected neurons. A simple example shows good logical abilities of the refractory neural field. A three neuron refractory field is able to solve the exclusive OR problem.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134072568","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A learning method for recurrent networks based on minimization of finite automata
Itsuki Noda, Makoto Nagao
Pub Date: 1992-06-07 | DOI: 10.1109/IJCNN.1992.287211
A novel network model and a learning algorithm based on symbol-processing theory are described. The algorithm is derived from the minimization method for finite automata under the correspondence between Elman networks and finite automata. An attempt was made to learn context-free grammars with the new network model. Even though the learning method was derived from the correspondence to finite automata, the network can learn subgrammars, the important feature distinguishing context-free grammars from finite-state automata.
{"title":"A learning method for recurrent networks based on minimization of finite automata","authors":"Itsuki Noda, Makoto Nagao","doi":"10.1109/IJCNN.1992.287211","DOIUrl":"https://doi.org/10.1109/IJCNN.1992.287211","url":null,"abstract":"A novel network model and a learning algorithm based on symbol processing theory are described. The algorithm is derived from the minimization method of finite automata under the correspondence between Elman networks and finite automata. An attempt was made to learn context-free grammars by the new model network. Even though this learning method was derived under the correspondence to finite automata, the network can learn the subgrammar, which is the important feature for distinguishing context-free grammars and finite state automata.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134145810","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Information-theoretic analysis of finite register effects in neural networks
M. Walker, L. Akers
Pub Date: 1992-06-07 | DOI: 10.1109/IJCNN.1992.226911
Information theory is used to analyze the effects of finite resolution and nonlinearities in multilayered networks. The authors formulate the effect on the information content of a neural processing element's output caused by storing continuous quantities in binary registers. The analysis reveals that the effect of quantization on information in a neural processing element is a function of the information content of the input, as well as the node nonlinearity and the length of the binary register containing the output. By casting traditional types of neural processing in statistical form, two classes of information processing in neural networks are identified, with widely different resolution requirements. Information theory is thus shown to formalize this taxonomy of neural network processing and to provide a link between the highly abstract processing performed by a neural network and the constraints of its implementation.
{"title":"Information-theoretic analysis of finite register effects in neural networks","authors":"M. Walker, L. Akers","doi":"10.1109/IJCNN.1992.226911","DOIUrl":"https://doi.org/10.1109/IJCNN.1992.226911","url":null,"abstract":"Information theory is used to analyze the effects of finite resolution and nonlinearities in multi-layered networks. The authors formulate the effect on the information content of the output of a neural processing element caused by storing continuous quantities in binary registers. The analysis reveals that the effect of quantization on information in a neural processing element is a function of the information content of the input, as well as the node nonlinearity and the length of the binary register containing the output. By casting traditional types of neural processing in statistical form, two classes of information processing in neural networks are identified. Each has widely different resolution requirements. Information theory is thus shown to provide a means of formalizing this taxonomy of neural network processing and is a method for linking the highly abstract processing performed by a neural network and the constraints of its implementation.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134364711","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Dynamics of interconnection development within visual cortex
D. Dong, J. Hopfield
Pub Date: 1992-06-07 | DOI: 10.1109/IJCNN.1992.227186
Two kinds of dynamic processes take place in neural networks. One involves the change with time of the activity of each neuron; the other involves the change in the strength of the connections (synapses) between neurons. When a neural network is learning or developing, both processes take place, and their dynamics interact. A theoretical framework is developed to help understand the combined activity and synapse dynamics for a class of adaptive networks. The methods are illustrated by using them to describe the development of orientation-selective cells in the cat primary visual cortex. Within this model, the column structure of different orientation-selective neurons originates from feedback pathways within an area of the cortex, rather than feedforward pathways between areas.
{"title":"Dynamics of interconnection development within visual cortex","authors":"D. Dong, J. Hopfield","doi":"10.1109/IJCNN.1992.227186","DOIUrl":"https://doi.org/10.1109/IJCNN.1992.227186","url":null,"abstract":"Two kinds of dynamic processes take place in neural networks. One involves the change with time of the activity of each neuron. The other involves the change in the strength of the connections (synapses) between neurons. When a neural network is learning or developing, both processes take place, and their dynamics interact. A theoretical framework is developed to help understand the combined activity and synapse dynamics for a class of adaptive networks. The methods are illustrated by using them to describe the development of orientation-selective cells in the cat primary visual cortex. Within this model, the column structure of different orientation-selective neurons originates from feedback pathways within an area of the cortex, rather than feedforward pathways between areas.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"166 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132618480","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Discriminative training algorithm for predictive neural network models
Y. Liu, Y. Lee, H. Chen, G. Sun
Pub Date: 1992-06-07 | DOI: 10.1109/IJCNN.1992.227239
A discriminative training algorithm for predictive neural network models is proposed. The algorithm is applied to a speaker-independent isolated-digit recognition experiment. The recognition error rate falls from 2.52% when the classifier is trained with a non-discriminative algorithm to 0.58% when the discriminative algorithm is applied. The increase in the classifier's discrimination ability is also demonstrated.
{"title":"Discriminative training algorithm for predictive neural network models","authors":"Y. Liu, Y. Lee, H. Chen, G. Sun","doi":"10.1109/IJCNN.1992.227239","DOIUrl":"https://doi.org/10.1109/IJCNN.1992.227239","url":null,"abstract":"A discriminative training algorithm for predictive neural network models is proposed. The algorithm is applied to a speaker independent isolated digit recognition experiment. The recognition error rate is reduced from 2.52% when the classifier is trained with a non-discriminative algorithm to 0.58% when the discriminative algorithm is applied. The increase in classifier discrimination ability is also demonstrated.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131083763","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A VLSI implementable design for Boltzmann machine
S. Bhandari, K. Murthy, S.S.S.P. Rao
Pub Date: 1992-06-07 | DOI: 10.1109/IJCNN.1992.226914
The authors report on a VLSI design for the realization of a Boltzmann machine. It is shown analytically and through simulation results that the acceptance probability function of the proposed design is the same as that of the Boltzmann machine. With such a design, it is possible to build special-purpose hardware for obtaining good solutions to several optimization problems. The acceptance probability characteristics of the Boltzmann machine are analyzed.
{"title":"A VLSI implementable design for Boltzmann machine","authors":"S. Bhandari, K. Murthy, S.S.S.P. Rao","doi":"10.1109/IJCNN.1992.226914","DOIUrl":"https://doi.org/10.1109/IJCNN.1992.226914","url":null,"abstract":"The authors report on a VLSI design for the realization of a Boltzmann machine. It is shown analytically and by the use of simulation results that the acceptance probability function for the proposed design is the same as that of the Boltzmann machine. Using such a design, it is possible to build special purpose hardware for obtaining good solutions for several optimization problems. The acceptance probability characteristics of the Boltzmann machine are analyzed.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"51 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128871845","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Experiments in language identification with neural networks
D.P. Morgan, L. Riek, W.J. Mistretta, C. Scofield, P. Grouin, F. Hull
Pub Date: 1992-06-07 | DOI: 10.1109/IJCNN.1992.226968
Automatic language identification is a computer-based process for determining the language of speakers while they are speaking. The authors have studied many types of conventional speech-processing features, along with parametric and non-parametric models for classification. They have experimented with the restricted Coulomb energy (RCE) network and found that histograms of the RCE cell activations provide a good metric for identifying languages. Results compare favourably with duplicated experiments reported by other researchers. These experiments are discussed, and the databases currently being used in this research are described.
{"title":"Experiments in language identification with neural networks","authors":"D.P. Morgan, L. Riek, W.J. Mistretta, C. Scofield, P. Grouin, F. Hull","doi":"10.1109/IJCNN.1992.226968","DOIUrl":"https://doi.org/10.1109/IJCNN.1992.226968","url":null,"abstract":"Automatic language identification is a computer-based process to determine the language of speakers while they are speaking. The authors have studied many types of conventional speech processing features and parametric and non-parametric models for classification. They have experimented with the restricted Coloumb energy (RCE) network and found that histograms of the RCE cell activations provide a good metric to identify languages. Results compare favourably to experiments reported by other researchers which have been duplicated. These experiments are discussed, and the databases which are currently being used in this research are described.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"51 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128901834","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Half-distributed coding makes adaptation of sigmoid-threshold useless in back-propagation networks
V. Lorquet
Pub Date: 1992-06-07 | DOI: 10.1109/IJCNN.1992.227088
The effects of adjusting the thresholds of the hidden cells during learning in a one-hidden-layer backpropagation network with half-distributed coding of the inputs are analyzed. The fundamentals of this coding method are reviewed; although it can be applied to both the inputs and the outputs of the network, only the case of the inputs is considered. It is shown that the effects of threshold adaptation become more favorable as the task to be achieved becomes less complex. The correctness of the theoretical model was tested with a real-world application in which the network approximates a function realizing a numerical model of a physical phenomenon.
{"title":"Half-distributed coding makes adaptation of sigmoid-threshold useless in back-propagation networks","authors":"V. Lorquet","doi":"10.1109/IJCNN.1992.227088","DOIUrl":"https://doi.org/10.1109/IJCNN.1992.227088","url":null,"abstract":"The effects of the adjustment of the threshold of the hidden cells during learning in a one-hidden-layer backpropagation network with half-distributed coding of inputs are analyzed. The fundamentals of this coding method are reviewed. Although it can be applied to both inputs and outputs of the network, only the case of the inputs is considered. The effects of the modification of the thresholds during learning are analyzed. It is shown that these effects become more favorable as the task to be achieved becomes less complex. The correctness of the theoretical model was tested with a real-world application. The network has to approximate a function to realize a numerical model of a physical phenomenon.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127855192","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}