Random weights search in compressed neural networks using overdetermined pseudoinverse

M. Manic, B. Wilamowski
{"title":"Random weights search in compressed neural networks using overdetermined pseudoinverse","authors":"M. Manic, B. Wilamowski","doi":"10.1109/ISIE.2003.1267901","DOIUrl":null,"url":null,"abstract":"Proposed algorithm exhibits 2 significant advantages: easier hardware implementation and robust convergence. Proposed algorithm considers one hidden layer neural network architecture and consists of following major phases. First phase is reduction of weight set. Second phase is gradient calculation on such compressed network. Search for weights is done only in the input layer, while output layer is trained always with pseudo-inversion training. Algorithm is further improved with adaptive network parameters. Final algorithm behavior exhibits robust and fast convergence. Experimental results are illustrated by figures and tables.","PeriodicalId":166431,"journal":{"name":"2003 IEEE International Symposium on Industrial Electronics ( Cat. No.03TH8692)","volume":"28 9","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2003-06-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2003 IEEE International Symposium on Industrial Electronics ( Cat. No.03TH8692)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISIE.2003.1267901","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

The proposed algorithm exhibits two significant advantages: easier hardware implementation and robust convergence. The algorithm considers a single-hidden-layer neural network architecture and consists of the following major phases. The first phase is the reduction of the weight set. The second phase is gradient calculation on the resulting compressed network. The search for weights is performed only in the input layer, while the output layer is always trained by pseudo-inversion. The algorithm is further improved with adaptive network parameters. The final algorithm exhibits robust and fast convergence. Experimental results are illustrated with figures and tables.
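The training scheme described in the abstract can be illustrated with a short sketch: candidate input-layer (hidden) weights are drawn by random search, and for each candidate the output layer is solved in closed form with the overdetermined pseudoinverse, i.e. a linear least-squares fit of the hidden activations to the targets. This is a minimal illustration under stated assumptions, not the authors' implementation; the weight-set reduction, gradient phase, and adaptive parameters are omitted, and all function and parameter names (train_random_search_pinv, n_hidden, n_trials) are illustrative.

    # Minimal sketch (not the paper's code): random search over hidden weights,
    # output layer solved by the overdetermined (least-squares) pseudoinverse.
    import numpy as np

    def train_random_search_pinv(X, Y, n_hidden=10, n_trials=200, seed=0):
        """Return (W_hidden, W_out, best_mse) for a single-hidden-layer tanh network."""
        rng = np.random.default_rng(seed)
        n_samples, n_inputs = X.shape
        Xb = np.hstack([X, np.ones((n_samples, 1))])      # inputs with bias column
        best = (None, None, np.inf)
        for _ in range(n_trials):
            # Random candidate for the input-to-hidden weights (the "search" phase).
            W_hidden = rng.uniform(-1.0, 1.0, size=(n_inputs + 1, n_hidden))
            H = np.tanh(Xb @ W_hidden)                    # hidden-layer activations
            Hb = np.hstack([H, np.ones((n_samples, 1))])  # activations with bias column
            # Output layer: least-squares solution of the overdetermined system Hb @ W_out = Y,
            # equivalent to applying the pseudoinverse of Hb.
            W_out, *_ = np.linalg.lstsq(Hb, Y, rcond=None)
            mse = np.mean((Hb @ W_out - Y) ** 2)
            if mse < best[2]:
                best = (W_hidden, W_out, mse)
        return best

    # Example: approximate y = sin(x) on 50 samples with 5 hidden neurons,
    # so the output-layer system (50 equations, 6 unknowns) is overdetermined.
    X = np.linspace(-np.pi, np.pi, 50).reshape(-1, 1)
    Y = np.sin(X)
    W_h, W_o, mse = train_random_search_pinv(X, Y, n_hidden=5, n_trials=300)
    print("best MSE over random trials:", mse)

Because the output layer is solved exactly for every random candidate, the search space is reduced to the input-layer weights only, which is what makes the combination of random search and pseudo-inversion tractable.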