{"title":"基于超定伪逆的压缩神经网络随机权搜索","authors":"M. Manic, B. Wilamowski","doi":"10.1109/ISIE.2003.1267901","DOIUrl":null,"url":null,"abstract":"Proposed algorithm exhibits 2 significant advantages: easier hardware implementation and robust convergence. Proposed algorithm considers one hidden layer neural network architecture and consists of following major phases. First phase is reduction of weight set. Second phase is gradient calculation on such compressed network. Search for weights is done only in the input layer, while output layer is trained always with pseudo-inversion training. Algorithm is further improved with adaptive network parameters. Final algorithm behavior exhibits robust and fast convergence. Experimental results are illustrated by figures and tables.","PeriodicalId":166431,"journal":{"name":"2003 IEEE International Symposium on Industrial Electronics ( Cat. No.03TH8692)","volume":"28 9","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2003-06-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"Random weights search in compressed neural networks using overdetermined pseudoinverse\",\"authors\":\"M. Manic, B. Wilamowski\",\"doi\":\"10.1109/ISIE.2003.1267901\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Proposed algorithm exhibits 2 significant advantages: easier hardware implementation and robust convergence. Proposed algorithm considers one hidden layer neural network architecture and consists of following major phases. First phase is reduction of weight set. Second phase is gradient calculation on such compressed network. Search for weights is done only in the input layer, while output layer is trained always with pseudo-inversion training. Algorithm is further improved with adaptive network parameters. Final algorithm behavior exhibits robust and fast convergence. Experimental results are illustrated by figures and tables.\",\"PeriodicalId\":166431,\"journal\":{\"name\":\"2003 IEEE International Symposium on Industrial Electronics ( Cat. No.03TH8692)\",\"volume\":\"28 9\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2003-06-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2003 IEEE International Symposium on Industrial Electronics ( Cat. No.03TH8692)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISIE.2003.1267901\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2003 IEEE International Symposium on Industrial Electronics ( Cat. No.03TH8692)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISIE.2003.1267901","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Random weights search in compressed neural networks using overdetermined pseudoinverse
The proposed algorithm exhibits two significant advantages: easier hardware implementation and robust convergence. It assumes a neural network architecture with one hidden layer and consists of the following major phases. The first phase is the reduction of the weight set; the second is gradient calculation on the resulting compressed network. The search for weights is performed only in the input layer, while the output layer is always trained by pseudo-inverse training. The algorithm is further improved with adaptive network parameters. The final algorithm exhibits robust and fast convergence. Experimental results are illustrated with figures and tables.
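The abstract describes a scheme in which only the input-layer weights are searched, while the output layer is solved in one shot as an overdetermined least-squares problem via the Moore-Penrose pseudoinverse. Below is a minimal sketch of that core idea, assuming a tanh hidden layer, a uniform random search over input weights, and a mean-squared-error criterion. The function names (`pinv_output_training`, `random_weight_search`) and all parameter values are illustrative assumptions; the paper's weight-set compression phase and adaptive network parameters are not reproduced here.

```python
import numpy as np

def pinv_output_training(H, Y):
    # Solve the overdetermined system H @ W_out = Y in the least-squares
    # sense via the Moore-Penrose pseudoinverse (more samples than hidden
    # units, so the system has no exact solution in general).
    return np.linalg.pinv(H) @ Y

def random_weight_search(X, Y, n_hidden=10, n_trials=200, seed=0):
    # Randomly sample input-layer weights; for each candidate, train the
    # output layer analytically and keep the best-scoring weight pair.
    rng = np.random.default_rng(seed)
    n_samples, n_inputs = X.shape
    Xb = np.hstack([X, np.ones((n_samples, 1))])  # append bias column
    best_err, best = np.inf, None
    for _ in range(n_trials):
        W_in = rng.uniform(-1.0, 1.0, size=(n_inputs + 1, n_hidden))
        H = np.tanh(Xb @ W_in)              # hidden-layer activations
        W_out = pinv_output_training(H, Y)  # one-shot output training
        err = np.mean((H @ W_out - Y) ** 2)
        if err < best_err:
            best_err, best = err, (W_in, W_out)
    return best, best_err

# Toy usage: fit XOR with the random-search + pseudoinverse scheme.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
(W_in, W_out), err = random_weight_search(X, Y, n_hidden=5)
print("MSE:", err)
```

Because the output weights are obtained analytically for each candidate, every trial is cheap and gradient-free, which is consistent with the abstract's claim that the approach lends itself to hardware implementation.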