{"title":"Persistent Excitation of Improved RBF Neural Networks: Neuron Dynamic-Growing Strategy","authors":"Min Wang;Mingyu Wang;Chenguang Yang","doi":"10.1109/TNNLS.2025.3528118","DOIUrl":null,"url":null,"abstract":"This brief proposes a novel neuron dynamic-growing (NDG) strategy for radial basis function neural networks (RBF NNs). Only one neuron is selected in advance based on the system's initial states, and the remaining neurons are generated dynamically according to a designed threshold on the distance between the current NN input and the closest existing neuron. Compared with an RBF NN using the neuron fixed evenly spaced (NFES) strategy, the improved RBF NN has two major advantages: it drastically reduces the number of neurons, especially for high-dimensional NN inputs, and it provides a theoretical criterion for choosing the NN structure parameters, including the neuron centers and the size of the compact set. To guarantee the dynamic learning ability of the improved RBF NN, the persistent excitation (PE) condition is verified rigorously by carefully constructing the threshold and the centers of newly added neurons. Simulation and experimental results illustrate that integrating the improved RBF NN into existing dynamic learning control effectively enhances transient control performance, reduces the computational burden, and saves data storage space.","PeriodicalId":13303,"journal":{"name":"IEEE transactions on neural networks and learning systems","volume":"36 8","pages":"15553-15560"},"PeriodicalIF":8.9000,"publicationDate":"2025-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks and learning systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10879145/","RegionNum":1,"RegionCategory":"Computer Science","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
This brief proposes a novel neuron dynamic-growing (NDG) strategy for radial basis function neural networks (RBF NNs). Only one neuron is selected in advance based on the system's initial states, and the remaining neurons are generated dynamically according to a designed threshold on the distance between the current NN input and the closest existing neuron. Compared with an RBF NN using the neuron fixed evenly spaced (NFES) strategy, the improved RBF NN has two major advantages: it drastically reduces the number of neurons, especially for high-dimensional NN inputs, and it provides a theoretical criterion for choosing the NN structure parameters, including the neuron centers and the size of the compact set. To guarantee the dynamic learning ability of the improved RBF NN, the persistent excitation (PE) condition is verified rigorously by carefully constructing the threshold and the centers of newly added neurons. Simulation and experimental results illustrate that integrating the improved RBF NN into existing dynamic learning control effectively enhances transient control performance, reduces the computational burden, and saves data storage space.
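The growth rule described above can be sketched in a few lines: start with a single Gaussian neuron centered at the initial state, and add a new neuron at the current input whenever that input is farther than a threshold from every existing center. This is a minimal illustrative sketch only; the threshold `d_th` and width `sigma` here are placeholder values, not the paper's PE-preserving constructions.

```python
import numpy as np

class GrowingRBF:
    """Minimal sketch of a dynamically growing RBF network.

    Illustrative only: d_th and sigma are assumed placeholder
    parameters, not the threshold/center construction from the brief.
    """

    def __init__(self, x0, d_th=0.5, sigma=0.3):
        # Start with a single neuron centered at the initial state x0.
        self.centers = [np.asarray(x0, dtype=float)]
        self.d_th = d_th    # distance threshold that triggers growth
        self.sigma = sigma  # Gaussian width

    def maybe_grow(self, x):
        """Add a neuron at x if x is farther than d_th from every center.

        Returns the current neuron count.
        """
        x = np.asarray(x, dtype=float)
        d_min = min(np.linalg.norm(x - c) for c in self.centers)
        if d_min > self.d_th:
            self.centers.append(x)
        return len(self.centers)

    def phi(self, x):
        """Gaussian activations of all current neurons at input x."""
        x = np.asarray(x, dtype=float)
        return np.array([
            np.exp(-np.linalg.norm(x - c) ** 2 / (2 * self.sigma ** 2))
            for c in self.centers
        ])
```

Because neurons appear only where the state trajectory actually visits, the neuron count scales with the visited region rather than with a grid over the whole compact set, which is the source of the savings for high-dimensional inputs.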
Journal Introduction
IEEE Transactions on Neural Networks and Learning Systems publishes scholarly articles on the theory, design, and applications of neural networks and other learning systems. The journal primarily highlights technical and scientific research in this domain.