A majorization penalty method for SVM with sparse constraint

Si-Tong Lu, Qingna Li
{"title":"A majorization penalty method for SVM with sparse constraint","authors":"Si-Tong Lu, Qingna Li","doi":"10.1080/10556788.2022.2142584","DOIUrl":null,"url":null,"abstract":"Support vector machine (SVM) is an important and fundamental technique in machine learning. Soft-margin SVM models have stronger generalization performance compared with the hard-margin SVM. Most existing works use the hinge-loss function which can be regarded as an upper bound of the 0–1 loss function. However, it cannot explicitly control the number of misclassified samples. In this paper, we use the idea of soft-margin SVM and propose a new SVM model with a sparse constraint. Our model can strictly limit the number of misclassified samples, expressing the soft-margin constraint as a sparse constraint. By constructing a majorization function, a majorization penalty method can be used to solve the sparse-constrained optimization problem. We apply Conjugate-Gradient (CG) method to solve the resulting subproblem. Extensive numerical results demonstrate the impressive performance of the proposed majorization penalty method.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"57 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-05-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optimization Methods and Software","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/10556788.2022.2142584","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Support vector machine (SVM) is an important and fundamental technique in machine learning. Soft-margin SVM models have stronger generalization performance than the hard-margin SVM. Most existing works use the hinge loss function, which can be regarded as an upper bound of the 0–1 loss function. However, the hinge loss cannot explicitly control the number of misclassified samples. In this paper, we use the idea of the soft-margin SVM and propose a new SVM model with a sparse constraint. By expressing the soft-margin constraint as a sparse constraint, our model can strictly limit the number of misclassified samples. By constructing a majorization function, a majorization penalty method can be used to solve the sparse-constrained optimization problem. We apply the conjugate gradient (CG) method to solve the resulting subproblem. Extensive numerical results demonstrate the impressive performance of the proposed majorization penalty method.
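For concreteness, the following is a minimal sketch of what such a sparse-constrained soft-margin model can look like. It assumes the standard linear SVM setup with training pairs $(x_i, y_i)$, $y_i \in \{-1, +1\}$, slack variables $\xi_i$, and a sparsity level $s$; these names and the exact form of the objective are illustrative assumptions, and the paper's formulation may differ in such details.

\[
\min_{w,\,b,\,\xi}\ \ \frac{1}{2}\|w\|^2
\quad \text{s.t.}\quad y_i\,(w^\top x_i + b) \ge 1 - \xi_i,\ \ \xi_i \ge 0,\ \ i = 1,\dots,n,
\qquad \|\xi\|_0 \le s.
\]

Here $\|\xi\|_0$ counts the nonzero slacks, so the constraint caps the number of margin violations (and hence of misclassified samples) at $s$, whereas the hinge loss only penalizes their total magnitude. In a penalty scheme of the kind the abstract describes, the sparse constraint is moved into the objective, and majorizing the penalized objective at each iteration yields a smooth subproblem that a conjugate gradient solver can handle.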