A neural network based on LVQ2 with dynamic building of the map

E. Maillard, B. Solaiman
{"title":"基于LVQ2的地图动态构建神经网络","authors":"E. Maillard, B. Solaiman","doi":"10.1109/ICNN.1994.374274","DOIUrl":null,"url":null,"abstract":"HLVQ network achieves a synthesis of supervised and unsupervised learning. Promising results have been reported elsewhere. A dynamic map-building technique for HLVQ is introduced, During learning, the creation of neurons follows a loose KD-tree algorithm. A criterion for the detection of the network weakness to match the topology of the training set is presented. This information is localized in the input space. When the weakness criterion is matched, a neuron is added to the existing map in a way that preserves the topology of the network. This new algorithm sets the network almost free of a crucial external parameter: the size of the neuron map. Furthermore, it is shown that the network presents highest classification score when employing constant learning rate and neighborhood size.<<ETX>>","PeriodicalId":209128,"journal":{"name":"Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)","volume":"53 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"A neural network based on LVQ2 with dynamic building of the map\",\"authors\":\"E. Maillard, B. Solaiman\",\"doi\":\"10.1109/ICNN.1994.374274\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"HLVQ network achieves a synthesis of supervised and unsupervised learning. Promising results have been reported elsewhere. A dynamic map-building technique for HLVQ is introduced, During learning, the creation of neurons follows a loose KD-tree algorithm. A criterion for the detection of the network weakness to match the topology of the training set is presented. This information is localized in the input space. When the weakness criterion is matched, a neuron is added to the existing map in a way that preserves the topology of the network. This new algorithm sets the network almost free of a crucial external parameter: the size of the neuron map. Furthermore, it is shown that the network presents highest classification score when employing constant learning rate and neighborhood size.<<ETX>>\",\"PeriodicalId\":209128,\"journal\":{\"name\":\"Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)\",\"volume\":\"53 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1994-06-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICNN.1994.374274\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICNN.1994.374274","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3

Abstract

The HLVQ network achieves a synthesis of supervised and unsupervised learning; promising results have been reported elsewhere. A dynamic map-building technique for HLVQ is introduced. During learning, the creation of neurons follows a loose KD-tree algorithm. A criterion is presented for detecting the network's weakness in matching the topology of the training set; this information is localized in the input space. When the weakness criterion is met, a neuron is added to the existing map in a way that preserves the topology of the network. This new algorithm makes the network almost free of a crucial external parameter: the size of the neuron map. Furthermore, it is shown that the network achieves its highest classification score when a constant learning rate and neighborhood size are used.
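To make the idea concrete, below is a minimal, illustrative Python sketch of an LVQ2-style classifier that grows its prototype map dynamically. It is not the paper's HLVQ algorithm: the weakness criterion is approximated here by a per-prototype misclassification counter (weakness_threshold is an assumed parameter), and the KD-tree-based, topology-preserving neuron creation described in the abstract is replaced by simply inserting a new prototype at the misclassified sample.

import numpy as np

class DynamicLVQ2:
    """Sketch of LVQ2 updates with dynamic prototype insertion (assumptions noted above)."""

    def __init__(self, lr=0.05, window=0.3, weakness_threshold=10):
        self.lr = lr                                    # constant learning rate, as advocated in the paper
        self.window = window                            # LVQ2 window parameter
        self.weakness_threshold = weakness_threshold    # assumed proxy for the paper's weakness criterion
        self.protos, self.labels, self.errors = [], [], []

    def _two_nearest(self, x):
        d = [np.linalg.norm(x - p) for p in self.protos]
        i, j = np.argsort(d)[:2]
        return int(i), int(j), d[i], d[j]

    def partial_fit(self, x, y):
        x = np.asarray(x, dtype=float)
        if len(self.protos) < 2:                        # bootstrap the map with the first samples
            self.protos.append(x.copy()); self.labels.append(y); self.errors.append(0)
            return
        i, j, di, dj = self._two_nearest(x)
        # LVQ2 rule: update only when x falls in a window around the decision boundary,
        # the nearest prototype has the wrong class and the second nearest the correct one.
        s = (1 - self.window) / (1 + self.window)
        in_window = min(di / (dj + 1e-12), dj / (di + 1e-12)) > s
        if in_window and self.labels[i] != y and self.labels[j] == y:
            self.protos[i] -= self.lr * (x - self.protos[i])   # push wrong-class prototype away
            self.protos[j] += self.lr * (x - self.protos[j])   # pull correct-class prototype closer
        if self.labels[i] != y:
            # Local "weakness": repeated errors near prototype i suggest the map does not
            # match the data topology there, so a new prototype is added at that location.
            self.errors[i] += 1
            if self.errors[i] >= self.weakness_threshold:
                self.protos.append(x.copy()); self.labels.append(y); self.errors.append(0)
                self.errors[i] = 0

    def predict(self, x):
        d = [np.linalg.norm(np.asarray(x, dtype=float) - p) for p in self.protos]
        return self.labels[int(np.argmin(d))]

In use, partial_fit would be called repeatedly over shuffled training samples; the map then grows only where classification errors accumulate, which is the spirit of making the map size an internal rather than external parameter.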