Evolving neural networks using ant colony optimization with pheromone trail limits

Michalis Mavrovouniotis, Shengxiang Yang
{"title":"Evolving neural networks using ant colony optimization with pheromone trail limits","authors":"Michalis Mavrovouniotis, Shengxiang Yang","doi":"10.1109/UKCI.2013.6651282","DOIUrl":null,"url":null,"abstract":"The back-propagation (BP) technique is a widely used technique to train artificial neural networks (ANNs). However, BP often gets trapped in a local optimum. Hence, hybrid training was introduced, e.g., a global optimization algorithm with BP, to address this drawback. The key idea of hybrid training is to use global optimization algorithms to provide BP with good initial connection weights. In hybrid training, evolutionary algorithms are widely used, whereas ant colony optimization (ACO) algorithms are rarely used, as the global optimization algorithms. And so far, only the basic ACO algorithm has been used to evolve the connection weights of ANNs. In this paper, we hybridize one of the best performing variations of ACO with BP. The difference of the improved ACO variation from the basic ACO algorithm lies in that pheromone trail limits are imposed to avoid stagnation behaviour. The experimental results show that the proposed training method outperforms other peer training methods.","PeriodicalId":106191,"journal":{"name":"2013 13th UK Workshop on Computational Intelligence (UKCI)","volume":"107 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"22","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 13th UK Workshop on Computational Intelligence (UKCI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/UKCI.2013.6651282","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 22

Abstract

Back-propagation (BP) is a widely used technique for training artificial neural networks (ANNs). However, BP often gets trapped in a local optimum. Hybrid training, which combines a global optimization algorithm with BP, was introduced to address this drawback: the global optimization algorithm provides BP with good initial connection weights. Evolutionary algorithms are widely used as the global optimizer in hybrid training, whereas ant colony optimization (ACO) algorithms are rarely used, and so far only the basic ACO algorithm has been applied to evolve the connection weights of ANNs. In this paper, we hybridize one of the best-performing ACO variations with BP. The improved variation differs from the basic ACO algorithm in that pheromone trail limits are imposed to avoid stagnation behaviour. Experimental results show that the proposed training method outperforms peer training methods.
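The hybrid scheme the abstract describes can be illustrated with a minimal sketch: an ACO phase with MAX-MIN style pheromone trail limits searches a discretized weight space and hands its best weight vector to a short BP (gradient-descent) phase. The sketch below is an assumption-laden toy, not the authors' implementation: the 2-2-1 XOR network, the candidate-value grid CANDIDATES, and the parameters TAU_MIN, TAU_MAX, RHO, N_ANTS and N_ITERS are all illustrative choices.

```python
# A minimal, illustrative sketch (assumption, not the paper's implementation):
# ACO with MAX-MIN style pheromone trail limits evolves initial weights for a
# tiny 2-2-1 network on XOR, and plain back-propagation then refines them.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_WEIGHTS = 2 * 2 + 2 + 2 * 1 + 1           # 9 weights/biases in a 2-2-1 net
CANDIDATES = np.linspace(-3.0, 3.0, 21)     # discretized weight values (assumed)
TAU_MIN, TAU_MAX = 0.01, 5.0                # pheromone trail limits
RHO, N_ANTS, N_ITERS = 0.1, 20, 100         # evaporation rate, colony size, iterations

def unpack(w):
    W1, b1 = w[0:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8].reshape(2, 1), w[8:9]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))        # hidden layer (sigmoid)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))      # output layer (sigmoid)
    return h, out

def mse(w):
    return float(np.mean((forward(w, X)[1] - y) ** 2))

# --- ACO phase: pheromone over (weight index, candidate value) pairs ---
tau = np.full((N_WEIGHTS, len(CANDIDATES)), 1.0)
best_w, best_idx, best_err = None, None, np.inf
for _ in range(N_ITERS):
    probs = tau / tau.sum(axis=1, keepdims=True)
    for _ in range(N_ANTS):
        # Each ant picks one candidate value per weight, biased by pheromone.
        idx = np.array([rng.choice(len(CANDIDATES), p=p) for p in probs])
        w = CANDIDATES[idx]
        err = mse(w)
        if err < best_err:
            best_err, best_w, best_idx = err, w, idx
    tau *= 1.0 - RHO                                 # evaporation
    tau[np.arange(N_WEIGHTS), best_idx] += 1.0 / (1.0 + best_err)  # best-ant deposit
    tau = np.clip(tau, TAU_MIN, TAU_MAX)             # trail limits avoid stagnation

# --- BP phase: gradient descent starting from the ACO-provided weights ---
w, lr = best_w.copy(), 0.5
for _ in range(2000):
    h, out = forward(w, X)
    _, _, W2, _ = unpack(w)
    d_out = (out - y) * out * (1.0 - out)            # output-layer delta
    d_h = (d_out @ W2.T) * h * (1.0 - h)             # hidden-layer delta
    grad = np.concatenate([
        (X.T @ d_h).ravel(), d_h.sum(axis=0),        # W1, b1
        (h.T @ d_out).ravel(), d_out.sum(axis=0),    # W2, b2
    ])
    w -= lr * grad / len(X)

print(f"ACO initial error: {best_err:.4f}, after BP refinement: {mse(w):.4f}")
```

Clipping the pheromone matrix to [TAU_MIN, TAU_MAX] after every update is the detail that distinguishes this variation from the basic ACO algorithm: it prevents any single candidate weight value from dominating the sampling distribution, which is the stagnation behaviour the abstract refers to.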