Guided convergence for training feed-forward neural network using novel gravitational search optimization

S. Saha, Dwaipayan Chakraborty, Oindrilla Dutta
DOI: 10.1109/ICHPCA.2014.7045348
Published in: 2014 International Conference on High Performance Computing and Applications (ICHPCA), December 2014
Citations: 2

Abstract

Training feed-forward neural networks with stochastic optimization techniques has recently gained importance in various pattern recognition and data mining applications because of their ability to escape local minima. However, such techniques may suffer from slow and poor convergence. This motivates our work on a meta-heuristic optimization technique for training the neural network. Specifically, we train the network with the gravitational search algorithm (GSA), which is based on Newton's laws of motion and the gravitational interaction of masses. GSA has a good ability to search for the global optimum, but its search can slow down in the final iterations. Our work targets smart convergence by modifying the original GSA and by guiding the algorithm to make it immune to the local-minima trap. Results on various benchmark datasets demonstrate the robustness of the modified algorithm.
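To make the abstract concrete, the following is a minimal sketch of the *standard* gravitational search algorithm that the paper builds on, applied to a simple test function. The guided-convergence modifications proposed in the paper are not reproduced here, and all parameter values (`G0`, `alpha`, population size, iteration count) are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch of the standard gravitational search algorithm (GSA):
# agents attract each other with a force proportional to their masses
# (derived from fitness) and inversely related to their distance, and a
# decaying gravitational constant G(t) shifts the search from
# exploration to exploitation.
import numpy as np

def gsa_minimize(f, dim=2, n_agents=20, iters=200, lo=-5.0, hi=5.0,
                 G0=100.0, alpha=20.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n_agents, dim))   # agent positions
    V = np.zeros_like(X)                       # agent velocities
    for t in range(iters):
        fit = np.array([f(x) for x in X])
        best, worst = fit.min(), fit.max()
        # Masses from fitness (minimization): better agents are heavier.
        m = (worst - fit) / (worst - best + 1e-12)
        M = m / (m.sum() + 1e-12)
        G = G0 * np.exp(-alpha * t / iters)    # gravitational constant decays
        A = np.zeros_like(X)
        for i in range(n_agents):
            for j in range(n_agents):
                if i == j:
                    continue
                R = np.linalg.norm(X[i] - X[j])
                # Acceleration from agent j: G * M_j * (x_j - x_i) / (R + eps),
                # weighted by a random factor for stochastic exploration.
                A[i] += rng.random() * G * M[j] * (X[j] - X[i]) / (R + 1e-12)
        V = rng.random(X.shape) * V + A        # randomly damped velocity update
        X = np.clip(X + V, lo, hi)             # move agents, keep in bounds
    fit = np.array([f(x) for x in X])
    return X[fit.argmin()], fit.min()

sphere = lambda x: float(np.sum(x ** 2))
x_best, f_best = gsa_minimize(sphere)
```

When GSA is used to train a feed-forward network, as in the paper, each agent's position is a flattened vector of all connection weights and biases, and `f` is the network's training error; the update rules themselves are unchanged.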