{"title":"优化SVM超参数","authors":"Huang Dongyuan, Chen Xiaoyun","doi":"10.1109/CINC.2010.5643857","DOIUrl":null,"url":null,"abstract":"Choosing optimal hyperparameters for Support Vector Machines(SVMs) is quite difficult but extremely essential in SVM design. This is usually done by minimizing estimates of generalization error such as the k-fold cross-validation error or the upper bound of leave-one-out(LOO) error. However, most of the approaches concentrate on the dual optimization problem of SVM. In this paper, we would like to consider the task of tuning hyperparameters in the primal. We derive a smooth validation function from the k-fold cross-validation, then tune hyperparameters by minimizing the smooth validation function using Quasi- Newton optimization technique. Experimental results not only show that our approach is much faster and provides more precise results than grid search method, but also demonstrate that tuning hyperparameters in the primal would be more efficient than in the dual due to advantages provided by the primal.","PeriodicalId":227004,"journal":{"name":"2010 Second International Conference on Computational Intelligence and Natural Computing","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"Tuning SVM hyperparameters in the primal\",\"authors\":\"Huang Dongyuan, Chen Xiaoyun\",\"doi\":\"10.1109/CINC.2010.5643857\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Choosing optimal hyperparameters for Support Vector Machines(SVMs) is quite difficult but extremely essential in SVM design. This is usually done by minimizing estimates of generalization error such as the k-fold cross-validation error or the upper bound of leave-one-out(LOO) error. However, most of the approaches concentrate on the dual optimization problem of SVM. In this paper, we would like to consider the task of tuning hyperparameters in the primal. We derive a smooth validation function from the k-fold cross-validation, then tune hyperparameters by minimizing the smooth validation function using Quasi- Newton optimization technique. 
Experimental results not only show that our approach is much faster and provides more precise results than grid search method, but also demonstrate that tuning hyperparameters in the primal would be more efficient than in the dual due to advantages provided by the primal.\",\"PeriodicalId\":227004,\"journal\":{\"name\":\"2010 Second International Conference on Computational Intelligence and Natural Computing\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2010-11-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2010 Second International Conference on Computational Intelligence and Natural Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CINC.2010.5643857\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 Second International Conference on Computational Intelligence and Natural Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CINC.2010.5643857","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 8

Abstract

Choosing optimal hyperparameters for Support Vector Machines (SVMs) is difficult but essential in SVM design. This is usually done by minimizing an estimate of the generalization error, such as the k-fold cross-validation error or an upper bound on the leave-one-out (LOO) error. However, most approaches concentrate on the dual optimization problem of the SVM. In this paper, we consider the task of tuning hyperparameters in the primal. We derive a smooth validation function from the k-fold cross-validation error, then tune the hyperparameters by minimizing this smooth validation function with a quasi-Newton optimization technique. Experimental results show that our approach is much faster and more precise than grid search, and demonstrate that tuning hyperparameters in the primal is more efficient than in the dual, owing to advantages of the primal formulation.
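The abstract does not reproduce the authors' smooth validation function, so the sketch below only illustrates the general recipe under stated assumptions: a linear SVM trained in the primal with the smooth squared hinge loss, whose k-fold validation loss (also measured with the squared hinge, so the outer objective stays smooth) is minimized over log C by a quasi-Newton method (SciPy's L-BFGS-B with finite-difference gradients). The paper additionally derives its validation function analytically and can tune kernel parameters; none of that is reproduced here.

```python
# A minimal sketch, not the authors' exact formulation: primal SVM training
# with squared hinge loss (inner problem) and quasi-Newton tuning of log(C)
# against a smooth k-fold validation objective (outer problem).
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_classification
from sklearn.model_selection import KFold

def train_primal_svm(X, y, C):
    """Train a linear SVM in the primal with the smooth squared hinge loss."""
    n, d = X.shape
    def obj(w):
        margins = 1.0 - y * (X @ w)
        active = np.maximum(margins, 0.0)
        loss = 0.5 * w @ w + C * np.sum(active ** 2)
        grad = w + C * (X.T @ (-2.0 * active * y))
        return loss, grad
    res = minimize(obj, np.zeros(d), jac=True, method="L-BFGS-B")
    return res.x

def smooth_cv_loss(log_C, X, y, k=5):
    """Smooth k-fold validation objective: mean squared hinge on held-out folds."""
    C = np.exp(log_C[0])
    total = 0.0
    # Fixed random_state keeps the folds identical across calls, so the
    # objective is a deterministic function of log(C).
    for tr, va in KFold(n_splits=k, shuffle=True, random_state=0).split(X):
        w = train_primal_svm(X[tr], y[tr], C)
        margins = 1.0 - y[va] * (X[va] @ w)
        total += np.mean(np.maximum(margins, 0.0) ** 2)
    return total / k

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
y = 2 * y - 1  # labels in {-1, +1}

# Outer quasi-Newton search over log(C); gradients are finite-differenced,
# with a coarser step (eps) to stay above the inner solver's noise floor.
res = minimize(smooth_cv_loss, x0=[0.0], args=(X, y), method="L-BFGS-B",
               bounds=[(-8.0, 8.0)], options={"eps": 1e-3})
print("tuned C:", np.exp(res.x[0]), "validation loss:", res.fun)
```

The design choice this illustrates: because the outer objective is smooth in the hyperparameters, a quasi-Newton search needs only a handful of objective evaluations per hyperparameter, whereas grid search grows exponentially with the number of hyperparameters, consistent with the speed advantage the abstract reports.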
Latest articles in this journal:
Evolutionary design of ANN structure using genetic algorithm
Performance analysis of spread spectrum communication system in fading enviornment and Interference
Comprehensive evaluation of forest industries based on rough sets and artificial neural network
A new descent algorithm with curve search rule for unconstrained minimization
A multi-agent simulation for intelligence economy