Pub Date: 2002-11-18 | DOI: 10.1109/ICONIP.2002.1202795
Xinquan Zhao, Lun Zhou, X. Liao
In this paper, the local asymptotic stability and global asymptotic stability of the steady-state solutions of Hopfield neural networks with reaction-diffusion terms are investigated. Under the L₂ norm, by applying a differential inequality, sufficient criteria for the local exponential stability and global exponential stability of the steady-state solution of the system are established.
Title: Exponential stability of the steady state solution of Hopfield neural networks with reaction-diffusion terms under the L₂ norm
Published in: Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), 2002.
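The abstract can be made concrete with a generic formulation. The system below is a standard sketch of a Hopfield network with reaction-diffusion terms and of what exponential stability under the L₂ norm means; the paper's exact coefficients and boundary conditions may differ.

```latex
% Hopfield network with reaction-diffusion terms on a bounded domain Omega
% (generic form; the paper's assumptions may differ):
\begin{aligned}
\frac{\partial u_i}{\partial t} &= D_i \,\Delta u_i - a_i u_i
  + \sum_{j=1}^{n} w_{ij}\, g_j(u_j) + J_i ,
  \qquad x \in \Omega,\; i = 1,\dots,n, \\
\frac{\partial u_i}{\partial \nu} &= 0 \quad \text{on } \partial\Omega
  \qquad \text{(Neumann boundary condition).}
\end{aligned}

% Exponential stability of a steady state u^* under the L2 norm: there exist
% constants M \ge 1 and \varepsilon > 0 such that
\|u(t) - u^*\|_{L_2} \le M \,\|u(0) - u^*\|_{L_2}\, e^{-\varepsilon t},
\qquad
\|v\|_{L_2} = \Bigl( \int_\Omega \sum_{i=1}^{n} v_i^2 \, dx \Bigr)^{1/2}.
```

A "sufficient criterion" in this setting is typically an inequality on the $D_i$, $a_i$, $w_{ij}$ and the Lipschitz constants of the $g_j$ that forces such an $\varepsilon > 0$ to exist.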
Pub Date: 2002-11-18 | DOI: 10.1109/ICONIP.2002.1202802
J.M. Garcia, S. Lozano, K. Smith, T. Kwok, G. Villa
This paper deals with the problem of selecting and scheduling a set of orders to be manufactured and immediately delivered to the customer site. We consider m plants for production and V vehicles for distribution. Further constraints are the limited production capacity at the plants and the time windows within which orders must be served. A genetic algorithm (GA) to solve the problem is developed and tested empirically on randomly generated problems. To benchmark the GA, a graph-based exact method is proposed; however, the exact method is inefficient and can therefore only be used for small problems. The results show that our GA produces good-quality solutions.
Title: Coordinated scheduling of production and delivery from multiple plants and with time windows using genetic algorithms
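As a rough illustration of the kind of GA the abstract describes, the sketch below encodes each order's plant assignment (or rejection) in a chromosome and penalises capacity and time-window violations in the fitness. The instance data, operators, and penalty weights are all assumptions for illustration, not the authors' design.

```python
import random

random.seed(0)

# Toy instance (assumed, not from the paper): 8 orders, 2 plants.
def make_order():
    start = random.randint(2, 10)
    return {"profit": random.randint(5, 15),
            "work": random.randint(1, 4),
            "window": (start, start + 3)}

ORDERS = [make_order() for _ in range(8)]
PLANTS = [{"capacity": 8}, {"capacity": 8}]
TRAVEL = [[random.randint(1, 3) for _ in ORDERS] for _ in PLANTS]

def fitness(chrom):
    """chrom[i] = -1 if order i is rejected, else the index of its plant."""
    load = [0.0] * len(PLANTS)
    score = 0.0
    for i, p in enumerate(chrom):
        if p < 0:
            continue
        load[p] += ORDERS[i]["work"]
        arrive = load[p] + TRAVEL[p][i]   # sequential production + delivery
        lo, hi = ORDERS[i]["window"]
        score += ORDERS[i]["profit"]
        if not (lo <= arrive <= hi):
            score -= 20                   # time-window violation penalty
    for p, plant in enumerate(PLANTS):
        score -= 30 * max(0.0, load[p] - plant["capacity"])
    return score

def evolve(pop_size=40, gens=60):
    genes = [-1] + list(range(len(PLANTS)))
    # Seed with the all-reject solution so the best fitness starts at >= 0.
    pop = [[-1] * len(ORDERS)]
    pop += [[random.choice(genes) for _ in ORDERS] for _ in range(pop_size - 1)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]      # elitist survivor selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, len(ORDERS))
            child = a[:cut] + b[cut:]     # one-point crossover
            if random.random() < 0.3:     # point mutation
                child[random.randrange(len(ORDERS))] = random.choice(genes)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

Because the elitist selection never discards the incumbent best, the returned fitness is non-decreasing over generations, which mirrors why a GA is a practical benchmark target when the exact method only scales to small instances.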
Pub Date: 2002-11-18 | DOI: 10.1109/ICONIP.2002.1202820
L. Liao, Ka Kit Cheung
In this paper, a neural network model is proposed for the discrete-time optimal control problem with control constraints. The model is established based on the projection method, and a theoretical analysis of its convergence and stability is provided.
Title: A neural network model for discrete-time optimal control with control constraints
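The projection method mentioned in the abstract can be illustrated by a projected-gradient iteration on a toy linear-quadratic problem with box-constrained controls. The system, horizon, and step size below are assumptions for illustration, not the paper's model, and the discrete iteration stands in for the continuous-time network dynamics.

```python
# Toy discrete-time LQ problem: x[k+1] = a*x[k] + b*u[k], controls clipped
# to [lo, hi]; cost J = sum_k q*x[k]^2 + sum_k r*u[k]^2.
# Projection-method iteration: u <- P(u - alpha * grad J(u)),
# where P is the projection onto the box constraints.
a, b, q, r = 0.5, 0.5, 1.0, 0.1
N, x0 = 10, 1.0
lo, hi = -1.0, 1.0

def rollout(u):
    xs = [x0]
    for k in range(N):
        xs.append(a * xs[k] + b * u[k])
    return xs

def cost(u):
    xs = rollout(u)
    return sum(q * x * x for x in xs) + sum(r * uk * uk for uk in u)

def grad(u):
    """Gradient of J wrt u via the adjoint (costate) recursion."""
    xs = rollout(u)
    lam = [0.0] * (N + 1)
    lam[N] = 2 * q * xs[N]
    for k in range(N - 1, -1, -1):
        lam[k] = 2 * q * xs[k] + a * lam[k + 1]
    return [2 * r * u[k] + b * lam[k + 1] for k in range(N)]

def project(u):
    return [min(hi, max(lo, uk)) for uk in u]

u = [0.0] * N
alpha = 0.1
for _ in range(300):
    g = grad(u)
    u = project([u[k] - alpha * g[k] for k in range(N)])
```

For a convex quadratic cost and a small enough step size, this iteration decreases the cost monotonically while keeping every control inside its box, which is the behaviour the paper's convergence and stability analysis would formalise.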
Pub Date: 2002-11-18 | DOI: 10.1109/ICONIP.2002.1202893
A. Sánchez-Jiménez, V.M. Garcia, A. Perez-de Vargas, F. Panetsos
The oscillatory behaviour of both action potentials and the subthreshold membrane potential is widely accepted to be the basis of neuronal signalling. An important factor of this behaviour is the frequency at which the neuron is able to oscillate. Little is known about the factors that affect this frequency or whether it is modulated by some neuronal mechanism. In the present work we study the relation between the oscillation frequency displayed by inferior olive neurons and the passive currents, the calcium-activated conductances, and the reversal potential of the passive channels. We show that the oscillation frequency of inferior olive neurons can be determined by means of these parameters.
Title: Membrane dynamics and single-neuron signal processing
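As an illustration of how membrane parameters can set an oscillation frequency, the sketch below linearises a generic two-variable membrane model (a leak conductance plus one slow recovery conductance) and reads the frequency off the complex eigenvalues. The model and the parameter values are assumptions for illustration, not the authors' inferior-olive model.

```python
import cmath
import math

# Linearised two-variable membrane model around rest (deviations from rest):
#   C  dv/dt = -gL*v - g*w     (leak + slow coupled conductance)
#   tau dw/dt = v - w          (slow recovery variable)
C, gL, g, tau = 1.0, 0.1, 4.0, 10.0

# Jacobian of the linear system d/dt (v, w) = A (v, w).
A = [[-gL / C, -g / C],
     [1.0 / tau, -1.0 / tau]]

# Eigenvalues of a 2x2 matrix from its trace and determinant.
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = cmath.sqrt(tr * tr - 4 * det)
lam = (tr + disc) / 2

# A complex eigenvalue pair means a damped oscillation: the imaginary part
# sets the frequency, the (negative) real part the decay rate.
freq = abs(lam.imag) / (2 * math.pi)   # cycles per unit time
decay = lam.real
```

The point the abstract makes survives even in this caricature: the frequency is a closed-form function of the passive leak, the coupled conductance, and the recovery time constant, so measuring those parameters determines the oscillation frequency.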
Pub Date: 2002-11-18 | DOI: 10.1109/ICONIP.2002.1198117
Y. Hirokawa, S. Abe
In this paper, we propose a new method for training support vector regressors. In our method, we partition the variables into two sets: a working set, consisting of more than two variables, and a set of fixed variables. We then optimize the variables in the working set using the steepest ascent method. If the Hessian matrix associated with the working set is not positive definite, we calculate corrections only for the independent variables in the working set. We test our method on two benchmark data sets and show that increasing the working set size speeds up the training of support vector regressors.
Title: Training of support vector regressors based on the steepest ascent method
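The working-set idea can be sketched on a generic concave quadratic rather than the actual SVR dual: pick the variables with the largest gradient magnitudes, ascend along the gradient restricted to that set with an exact line search, and fall back when the restricted Hessian is not positive definite. The objective, selection rule, and fallback below are simplifying assumptions, not the authors' exact algorithm.

```python
# Working-set steepest ascent on a concave quadratic
# f(b) = c.b - 0.5 * b.Q.b   (illustration only; the paper maximises
# the SVR dual objective instead).
N = 6
Q = [[4.0 if i == j else 0.5 for j in range(N)] for i in range(N)]  # PD
c = [1.0, -2.0, 3.0, 0.5, -1.0, 2.0]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def f(b):
    return dot(c, b) - 0.5 * sum(b[i] * dot(Q[i], b) for i in range(N))

def grad(b):
    return [c[i] - dot(Q[i], b) for i in range(N)]

def ascend(b, working_size=3, iters=100):
    for _ in range(iters):
        g = grad(b)
        # Working set: the variables with the largest gradient magnitudes.
        ws = set(sorted(range(N), key=lambda i: -abs(g[i]))[:working_size])
        d = [g[i] if i in ws else 0.0 for i in range(N)]
        dQd = sum(d[i] * dot(Q[i], d) for i in range(N))
        if dQd <= 1e-12:   # Hessian not PD on the working set: the paper
            break          # then corrects only independent variables; we stop.
        step = dot(g, d) / dQd          # exact line search for a quadratic
        b = [b[i] + step * d[i] for i in range(N)]
    return b

b0 = [0.0] * N
b1 = ascend(b0)
```

A larger `working_size` moves more variables per exact line search, which is the mechanism behind the speed-up the abstract reports when the working set is enlarged.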
Pub Date: 2002-11-18 | DOI: 10.1109/ICONIP.2002.1198151
J. Klecková, J. Krutisová, V. Matousek, J. Schwarz
For some languages, and especially for Czech with its free word order, prosody provides critical information for a recognition and understanding system. For some sentences the speaker's style is essential to determining the core of the communication, since the speaker uses it to emphasise the meaning of the sentence. This paper describes the first results on determining a speaker's style. The experiments show that speech recognition quality is improved by style determination using prosodic characteristics.
Title: Important prosody characteristics for spontaneous speech recognition
Pub Date: 2002-11-18 | DOI: 10.1109/ICONIP.2002.1199029
S. Usui
The NRV (Neuroinformatics Research in Vision) project, started in 1999, is the first project in Japan under the Strategic Promotion System for Brain Science of the Special Coordination Funds for Promoting Science and Technology (SCF) at the Science and Technology Agency (now part of MEXT, the Ministry of Education, Culture, Sports, Science and Technology), and it aims to build the foundation of neuroinformatics research. Because of the wealth of data on the visual system, the NRV project uses vision research to promote experimental, theoretical and technical research in neuroinformatics. Details can be found at: http://www.neuroinformatics.gr.jp/.
Title: Neuroinformatics in vision science: NRV project and visiome environment
Pub Date: 2002-11-18 | DOI: 10.1109/ICONIP.2002.1202873
S. Sugiyama
Much information must be processed in order to build various systems. Many AI methods and techniques are now available that can handle input information intelligently to produce a desired output. However, these methods are not flexible enough to produce a proper output when the first output is not the desired one: once a method has produced an output, that output is final and cannot be revised, whatever is expected of it. In this situation, systems need some kind of dynamic knowledge-base behaviour that yields a more appropriate output for a given input. This paper therefore discusses the following themes: 1) a communication method among the processes in a system, 2) a general mechanism for the dynamic behaviour of the AI and the knowledge base, and 3) the system structure.
Title: Artificial intelligence behavior in dynamic knowledge base
Pub Date: 2002-11-18 | DOI: 10.1109/ICONIP.2002.1202213
J. Mitrpanont, A. Srisuphab
The paper presents the quantum complex-valued backpropagation neural network (QCBPN). Our research develops a quantum neural network that uses a complex-valued backpropagation learning algorithm to solve classification problems. The concept of the QCBPN emerged from research on quantum circuit neural networks and from the complex-valued backpropagation algorithm. We found that complex values and quantum states share a natural representation suitable for parallel computation. The quantum circuit neural network provides a qubit-like neuron model based on quantum mechanics with a quantum backpropagation learning rule, while the complex-valued backpropagation algorithm modifies the standard backpropagation algorithm to learn complex-valued patterns in a natural way. The quantum complex-valued neuron model and the QCBPN learning algorithm are described. Finally, the realization of the QCBPN is demonstrated on a simple pattern recognition problem.
Title: The realization of quantum complex-valued backpropagation neural network in pattern recognition problem
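The complex-valued backpropagation step can be sketched for a single neuron with a split (real/imaginary) activation, a common choice in complex-valued networks. The activation, learning rate, and toy pattern below are assumptions for illustration; the QCBPN's qubit-like neuron model is different, and only the complex-valued gradient bookkeeping is shown here.

```python
import math
import random

# Split-type activation acting on real and imaginary parts separately.
def act(z):
    return complex(math.tanh(z.real), math.tanh(z.imag))

def act_deriv(z):
    return complex(1 - math.tanh(z.real) ** 2, 1 - math.tanh(z.imag) ** 2)

random.seed(1)
w = complex(random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5))
b = 0j
x, target = 1 + 1j, 0.5 - 0.5j   # toy training pattern (assumed)
lr = 0.5

for _ in range(500):
    z = w * x + b
    e = act(z) - target                              # complex error
    d = act_deriv(z)
    gz = complex(e.real * d.real, e.imag * d.imag)   # dL/dz, split per part
    w -= lr * gz * x.conjugate()                     # dL/dw = gz * conj(x)
    b -= lr * gz
```

The conjugate in the weight update is the characteristic feature of complex-valued backpropagation: it makes the real and imaginary gradient components land on the right parts of `w` so that the squared error on both parts of the output decreases.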
Pub Date: 2002-11-18 | DOI: 10.1109/ICONIP.2002.1198198
Jong-Seok Lee, C. Park
In this paper, we propose a new kind of neural network with a modular structure: the neural network with adaptive neurons. Each module is an adaptive neuron, which consists of a multi-layer neural network with sigmoid neurons. We develop an algorithm by which the network can automatically adjust its complexity to the given problem. The proposed network is compared with the projection pursuit learning network (PPLN), a popular modular architecture. The experimental results demonstrate that the proposed architecture outperforms the PPLN on four regression problems.
Title: Self-organizing neural networks using adaptive neurons
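The growth idea the abstract describes (add modules until the error is acceptable) can be sketched in a projection-pursuit style: fit one module to the current residual, keep it if it helps, and stop once the error is small. In the paper each module is itself a multi-layer network; the single tanh ridge unit, target function, and thresholds below are simplifying assumptions for illustration.

```python
import math
import random

random.seed(0)

# Toy 1-D regression task (assumed): fit y = x^2 on a grid.
X = [i / 10 for i in range(-10, 11)]
Y = [x * x for x in X]

def fit_module(residual, epochs=2000, lr=0.05):
    """Fit one ridge module a*tanh(w*x + c) to the residual by gradient descent."""
    a, w, c = (random.uniform(-1, 1) for _ in range(3))
    for _ in range(epochs):
        for x, r in zip(X, residual):
            t = math.tanh(w * x + c)
            e = a * t - r
            a -= lr * e * t
            g = e * a * (1 - t * t)
            w -= lr * g * x
            c -= lr * g
    return a, w, c

def predict(modules, x):
    return sum(a * math.tanh(w * x + c) for a, w, c in modules)

modules = []
for _ in range(5):                            # grow until the error is small
    residual = [y - predict(modules, x) for x, y in zip(X, Y)]
    mse = sum(r * r for r in residual) / len(residual)
    if mse < 1e-3:
        break
    m = fit_module(residual)
    new_res = [r - m[0] * math.tanh(m[1] * x + m[2])
               for x, r in zip(X, residual)]
    if sum(r * r for r in new_res) < sum(r * r for r in residual):
        modules.append(m)                     # keep a module only if it helps
```

The keep-only-if-it-helps guard makes the training error non-increasing as modules are added, which is one simple way a network can "adjust its complexity to the given problem".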