Stochastic Dual Coordinate Ascent for Learning Sign Constrained Linear Predictors

Miya Nakajima, Rikuto Mochida, Yuya Takada, Tsuyoshi Kato
{"title":"Stochastic Dual Coordinate Ascent for Learning Sign Constrained Linear Predictors","authors":"Miya Nakajima, Rikuto Mochida, Yuya Takada, Tsuyoshi Kato","doi":"10.5121/csit.2023.131801","DOIUrl":null,"url":null,"abstract":"Sign constraints are a handy representation of domain-specific prior knowledge that can be incorporated to machine learning. Under the sign constraints, the signs of the weight coefficients for linear predictors cannot be flipped from the ones specified in advance according to the prior knowledge. This paper presents new stochastic dual coordinate ascent (SDCA) algorithms that find the minimizer of the empirical risk under the sign constraints. Generic surrogate loss functions can be plugged into the proposed algorithm with the strong convergence guarantee inherited from the vanilla SDCA. A technical contribution of this work is the finding of an efficient algorithm that performs the SDCA update with a cost linear to the number of input features which coincides with the SDCA update without the sign constraints. Eventually, the computational cost O(nd) is achieved to attain an ϵ-accuracy solution. Pattern recognition experiments were carried out using a classification task for microbiological water quality analysis. 
The experimental results demonstrate the powerful prediction performance of the sign constraints.","PeriodicalId":91205,"journal":{"name":"Artificial intelligence and applications (Commerce, Calif.)","volume":"33 4","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Artificial intelligence and applications (Commerce, Calif.)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5121/csit.2023.131801","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Sign constraints are a convenient representation of domain-specific prior knowledge that can be incorporated into machine learning. Under sign constraints, the signs of the weight coefficients of a linear predictor cannot be flipped from those specified in advance according to the prior knowledge. This paper presents new stochastic dual coordinate ascent (SDCA) algorithms that find the minimizer of the empirical risk under sign constraints. Generic surrogate loss functions can be plugged into the proposed algorithm, with the strong convergence guarantee inherited from the vanilla SDCA. A technical contribution of this work is an efficient algorithm that performs the SDCA update at a cost linear in the number of input features, matching the cost of the SDCA update without sign constraints. Consequently, a computational cost of O(nd) suffices to attain an ϵ-accurate solution. Pattern recognition experiments were carried out on a classification task for microbiological water quality analysis. The experimental results demonstrate the powerful prediction performance obtained with sign constraints.
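To make the setting concrete, the following is a minimal illustrative sketch (not the paper's algorithm) of SDCA for ridge regression with squared loss, where sign constraints are imposed by projecting the primal iterate onto the prescribed orthant after each dual update. The closed-form squared-loss step is the standard vanilla SDCA update; the projection step here is a simplification for illustration, whereas the paper derives an exact SDCA update for the sign-constrained problem. The function and parameter names are our own, and each sign-projection costs O(d), giving O(nd) work per epoch.

```python
import numpy as np

def project_signs(v, s):
    """Clip each coordinate of v to the orthant given by the sign pattern s.

    s[j] = +1 forces w[j] >= 0, s[j] = -1 forces w[j] <= 0,
    s[j] = 0 leaves the coordinate unconstrained.
    """
    w = v.copy()
    pos, neg = s > 0, s < 0
    w[pos] = np.maximum(w[pos], 0.0)
    w[neg] = np.minimum(w[neg], 0.0)
    return w

def sdca_sign_constrained(X, y, s, lam=0.1, epochs=50, seed=0):
    """Illustrative SDCA for ridge regression (squared loss), with sign
    constraints enforced by projecting the primal iterate each update.

    Each inner update costs O(d), so one epoch costs O(n d).
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    alpha = np.zeros(n)          # dual variables
    v = np.zeros(d)              # v = (1 / (lam * n)) * sum_i alpha_i x_i
    w = project_signs(v, s)
    for _ in range(epochs):
        for i in rng.permutation(n):
            xi = X[i]
            # closed-form vanilla SDCA step for the squared loss
            delta = (y[i] - xi @ w - 0.5 * alpha[i]) / (0.5 + xi @ xi / (lam * n))
            alpha[i] += delta
            v += delta * xi / (lam * n)
            w = project_signs(v, s)   # O(d) enforcement of the sign pattern
    return w
```

On noiseless data generated from a nonnegative weight vector with all-positive constraints (`s = np.ones(d)`), the returned `w` satisfies the constraints exactly and fits the data closely.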