Some results on L1 convergence rate of RBF networks and kernel regression estimators
A. Krzyżak, L. Xu
DOI: 10.1109/ICNN.1994.374356
Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94), 27 June 1994
Rather than studying the L2 convergence rates of kernel regression estimators (KRE) and radial basis function (RBF) nets given in Xu-Krzyzak-Yuille (1992 & 1993), we study convergence properties of the mean integrated absolute error (MIAE) for KRE and RBF nets. It has been shown that the MIAE of KRE and RBF nets converges to zero as the size of the network and the size of the training sequence tend to infinity, and that the upper bound for the convergence rate of MIAE is O(n^{-αs/((2+s)(2α+d))}) for approximating Lipschitz functions.
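The paper itself gives no code; as a rough illustration of the quantity being bounded, the sketch below fits a Nadaraya-Watson kernel regression estimate (a standard KRE — the specific estimator, the bandwidth h = n^{-1/3}, the Gaussian kernel, and the target function are all assumptions for illustration, not taken from the paper) to noisy samples of a Lipschitz function and reports an empirical integrated absolute error as the training-set size n grows.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, h):
    """Nadaraya-Watson kernel regression estimate with a Gaussian kernel.

    A generic KRE for illustration; not necessarily the exact
    estimator analyzed in the paper."""
    # Kernel weights K((x - X_i)/h), shape (len(x_query), len(x_train))
    w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / h) ** 2)
    denom = w.sum(axis=1)
    denom[denom == 0] = 1.0  # guard against underflow far from the data
    return (w * y_train[None, :]).sum(axis=1) / denom

rng = np.random.default_rng(0)
f = lambda x: np.abs(x - 0.5)      # a Lipschitz target function
grid = np.linspace(0.0, 1.0, 400)  # grid for approximating the integral

errors = []
for n in [100, 1000, 10000]:
    x = rng.uniform(0.0, 1.0, n)
    y = f(x) + 0.1 * rng.normal(size=n)
    h = n ** (-1.0 / 3.0)          # a common bandwidth choice (assumption)
    fhat = nadaraya_watson(x, y, grid, h)
    # Empirical integrated absolute error over [0, 1]:
    # mean of |fhat - f| on the grid times the interval length
    iae = np.mean(np.abs(fhat - f(grid))) * 1.0
    errors.append(iae)
    print(f"n={n:6d}  IAE ~ {iae:.4f}")
```

The printed IAE shrinks as n grows, which is the empirical counterpart of the MIAE → 0 statement above (the theorem concerns the expectation of this quantity, not a single run).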