A comprehensive study of the backpropagation algorithm and modifications

A. Sidani, T. Sidani
DOI: 10.1109/SOUTHC.1994.498919
Journal: Conference Record Southcon
Published: 1994-03-29
Citations: 10

Abstract

Many connectionist/neural network learning systems use some derivative of the popular backpropagation (BP) algorithm. BP learning, however, is too slow for many applications. In addition, it scales poorly as tasks become larger and more complex. As a result, researchers in the field have come up with variations and modifications to the original BP learning technique that address the aforementioned issues. This research was conducted to collect a representative sample of BP modifications and compare them against one another. The benchmarks utilized are certain "toy problems" that have been extensively used in the literature. A software package that allows one to experiment with a multitude of BP variations was developed to achieve the desired goal. The modifications are evaluated and cross-examined for each task tested. The package provides the means for parameter optimization and allows a user to build hybrid algorithms based on the different functionalities and features of the various modifications.
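To make the kind of modification the abstract describes concrete, here is a minimal sketch of plain backpropagation augmented with a momentum term — one classic BP modification of the sort the paper surveys — trained on XOR, a typical "toy problem" benchmark from the literature. The 2-3-1 architecture, learning rate, and momentum coefficient are illustrative assumptions, not the paper's actual software package.

```python
# Sketch only: backpropagation with momentum on XOR.
# Hyperparameters and network size are assumed, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: four input patterns and their targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights and biases for a 2-3-1 network.
W1 = rng.normal(0.0, 1.0, (2, 3)); b1 = np.zeros((1, 3))
W2 = rng.normal(0.0, 1.0, (3, 1)); b2 = np.zeros((1, 1))

lr, mu = 0.5, 0.9  # learning rate and momentum coefficient (assumed values)
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass for squared-error loss: delta terms per layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Momentum: each step blends the fresh gradient with the previous step,
    # smoothing oscillations and often speeding up plain BP.
    vW2 = mu * vW2 - lr * (h.T @ d_out)
    vb2 = mu * vb2 - lr * d_out.sum(0, keepdims=True)
    vW1 = mu * vW1 - lr * (X.T @ d_h)
    vb1 = mu * vb1 - lr * d_h.sum(0, keepdims=True)
    W2 += vW2; b2 += vb2; W1 += vW1; b1 += vb1

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
mse = float(((pred - y) ** 2).mean())
print(np.round(pred.ravel(), 2), "mse =", round(mse, 4))
```

Comparing such a variant against unmodified BP on shared toy problems, under a common parameter-search harness, is essentially the experimental setup the paper's software package automates.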