Improving the performance of probabilistic neural networks
M. Musavi, K. Kalantri, W. Ahmed
[Proceedings 1992] IJCNN International Joint Conference on Neural Networks, 1992-06-07
DOI: 10.1109/IJCNN.1992.287147

A methodology for selecting appropriate widths, or covariance matrices, of the Gaussian functions in implementations of PNN (probabilistic neural network) classifiers is presented. The Gram-Schmidt orthogonalization process is employed to find these matrices. It has been shown that the proposed technique improves the generalization ability of PNN classifiers over the standard approach. The result can be applied to other Gaussian-based classifiers such as radial basis functions.
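For context, the standard PNN the abstract refers to places a Gaussian kernel on every training pattern and classifies by the largest class-conditional Parzen-window density estimate. Below is a minimal sketch of that baseline with a single shared width `sigma` (the hand-tuned quantity the paper replaces with per-pattern covariance matrices found via Gram-Schmidt orthogonalization; that selection step is not reproduced here, and the function name and toy data are illustrative, not from the paper).

```python
import numpy as np

def pnn_classify(X_train, y_train, X_test, sigma=0.5):
    """Classify each test point by the largest class-conditional
    Parzen-window density estimate built from Gaussian kernels."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        # Squared distances from x to every training pattern.
        d2 = np.sum((X_train - x) ** 2, axis=1)
        # Pattern layer: one Gaussian kernel activation per pattern.
        k = np.exp(-d2 / (2.0 * sigma ** 2))
        # Summation layer: mean activation per class; argmax decides.
        scores = [k[y_train == c].mean() for c in classes]
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

# Toy example: two well-separated 2-D clusters.
rng = np.random.default_rng(0)
X0 = rng.normal([0.0, 0.0], 0.3, size=(20, 2))
X1 = rng.normal([3.0, 3.0], 0.3, size=(20, 2))
X_train = np.vstack([X0, X1])
y_train = np.array([0] * 20 + [1] * 20)
print(pnn_classify(X_train, y_train, np.array([[0.1, 0.2], [2.9, 3.1]])))
```

A single isotropic `sigma` for all patterns is exactly the restriction the paper targets: a fixed spherical kernel ignores the local shape of the data, whereas a per-pattern covariance matrix lets each kernel stretch along the directions of its neighbors.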