An Experimental Study on Hyper-parameter Optimization for Stacked Auto-Encoders

Y. Sun, Bing Xue, Mengjie Zhang, G. Yen
DOI: 10.1109/CEC.2018.8477921
Published in: 2018 IEEE Congress on Evolutionary Computation (CEC), July 2018
Citations: 35

Abstract

Deep learning algorithms have shown their superiority, especially on challenging machine learning tasks, but they reach their best performance only when their hyper-parameters have been successfully optimized. The hyper-parameter optimization problem, however, is non-convex and non-differentiable, so traditional optimization algorithms cannot address it well. Evolutionary algorithms are a class of meta-heuristic search algorithms that are preferred for real-world optimization largely because they impose no mathematical requirements on the problems being optimized. Although most researchers in the deep learning community are aware of the effectiveness of evolutionary algorithms for optimizing the hyper-parameters of deep learning algorithms, many still believe that grid search is more effective when the number of hyper-parameters is small. To clarify this, we design a hyper-parameter optimization method based on particle swarm optimization (PSO), a widely used evolutionary algorithm, and perform 192 experimental comparisons on stacked auto-encoders, a class of deep learning algorithms with a relatively small number of hyper-parameters, comparing classification accuracy and computational complexity against grid search on eight widely used image classification benchmark datasets. The experimental results show that the proposed algorithm achieves comparable classification accuracy while saving 10x-100x computational complexity compared with grid search.
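The efficiency gap described above can be illustrated with a minimal sketch. This is not the authors' implementation: a toy quadratic objective stands in for the validation error of a trained stacked auto-encoder over two hyper-parameters (learning rate and hidden-layer size), the bounds and PSO coefficients are illustrative, and the point is only that PSO explores the same search space with far fewer objective evaluations than an exhaustive grid.

```python
import random

# Stand-in for SAE validation error as a function of two hyper-parameters.
# (The paper trains real stacked auto-encoders; this is a toy surrogate.)
def objective(lr, hidden):
    return (lr - 0.01) ** 2 * 1e4 + (hidden - 200) ** 2 * 1e-4

BOUNDS = [(0.001, 0.1), (50, 500)]  # (learning rate, hidden units)

def grid_search(steps=10):
    """Exhaustive grid: steps^2 objective evaluations."""
    evals, best = 0, (float("inf"), None)
    for i in range(steps):
        for j in range(steps):
            lr = BOUNDS[0][0] + i * (BOUNDS[0][1] - BOUNDS[0][0]) / (steps - 1)
            h = BOUNDS[1][0] + j * (BOUNDS[1][1] - BOUNDS[1][0]) / (steps - 1)
            evals += 1
            best = min(best, (objective(lr, h), (lr, h)))
    return best, evals

def pso(n_particles=5, iters=6, w=0.7, c1=1.5, c2=1.5):
    """Basic global-best PSO: n_particles * (iters + 1) evaluations."""
    rng = random.Random(0)
    pos = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [objective(*p) for p in pos]
    evals = n_particles
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp the updated position back into the search bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    BOUNDS[d][0]), BOUNDS[d][1])
            f = objective(*pos[i])
            evals += 1
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return (gbest_f, tuple(gbest)), evals
```

With these illustrative settings, grid search spends 100 evaluations while PSO spends 35, and since each "evaluation" in the real setting means training a full stacked auto-encoder, the evaluation count dominates the total computational cost.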