Batch-Stochastic Sub-Gradient Method for Solving Non-Smooth Convex Loss Function Problems

Kasimu Juma Ahmed
{"title":"求解非光滑凸损失函数问题的批量随机子梯度法","authors":"KasimuJuma Ahmed","doi":"10.5121/csit.2023.131806","DOIUrl":null,"url":null,"abstract":"Mean Absolute Error (MAE) and Mean Square Error (MSE) are machine learning loss functions that not only estimates the discrepancy between prediction and true label but also guide the optimal parameter of the model.Gradient is used in estimating MSE model and Sub-gradient in estimating MAE model. Batch and stochastic are two of the many variations of sub-gradient method but the former considers the entire dataset per iteration while the latter considers one data point per iteration. Batch-stochastic Sub-gradient method that learn based on the inputted data and gives stable estimated loss value than that of stochastic and memory efficient than that of batch has been developed by considering defined collection of data-point per iteration. The stability and memory efficiency of the method was tested using structured query language (SQL). The new method shows greater stability, accuracy, convergence, memory efficiencyand computational efficiency than any other existing method of finding optimal feasible parameter(s) of a continuous data.","PeriodicalId":91205,"journal":{"name":"Artificial intelligence and applications (Commerce, Calif.)","volume":"101 7-8","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Batch-Stochastic Sub-Gradient Method for Solving Non-Smooth Convex Loss Function Problems\",\"authors\":\"KasimuJuma Ahmed\",\"doi\":\"10.5121/csit.2023.131806\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Mean Absolute Error (MAE) and Mean Square Error (MSE) are machine learning loss functions that not only estimates the discrepancy between prediction and true label but also guide the optimal parameter of the model.Gradient is used in estimating MSE model and Sub-gradient in estimating MAE model. Batch and stochastic are two of the many variations of sub-gradient method but the former considers the entire dataset per iteration while the latter considers one data point per iteration. Batch-stochastic Sub-gradient method that learn based on the inputted data and gives stable estimated loss value than that of stochastic and memory efficient than that of batch has been developed by considering defined collection of data-point per iteration. The stability and memory efficiency of the method was tested using structured query language (SQL). 
The new method shows greater stability, accuracy, convergence, memory efficiencyand computational efficiency than any other existing method of finding optimal feasible parameter(s) of a continuous data.\",\"PeriodicalId\":91205,\"journal\":{\"name\":\"Artificial intelligence and applications (Commerce, Calif.)\",\"volume\":\"101 7-8\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-10-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Artificial intelligence and applications (Commerce, Calif.)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.5121/csit.2023.131806\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Artificial intelligence and applications (Commerce, Calif.)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5121/csit.2023.131806","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Mean Absolute Error (MAE) and Mean Squared Error (MSE) are machine-learning loss functions that not only measure the discrepancy between predictions and true labels but also guide the search for the model's optimal parameters. The gradient is used when fitting the MSE model, while the sub-gradient is used when fitting the MAE model, since the absolute-value loss is not differentiable everywhere. Batch and stochastic updates are two of the many variants of the sub-gradient method: the former processes the entire dataset per iteration, while the latter processes a single data point per iteration. By processing a defined collection of data points per iteration, a batch-stochastic sub-gradient method has been developed that learns from the input data, yields more stable loss estimates than the stochastic variant, and is more memory-efficient than the batch variant. The stability and memory efficiency of the method were tested using Structured Query Language (SQL). The new method shows greater stability, accuracy, convergence, memory efficiency, and computational efficiency than existing methods for finding the optimal feasible parameters of continuous data.
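To make the idea concrete, below is a minimal sketch of a batch-stochastic (mini-batch) sub-gradient update for the MAE loss on a linear model. The paper does not publish code, so everything here (the linear model, the sign-based sub-gradient, the batch_size default, and the diminishing learning-rate schedule) is an illustrative assumption, not the author's implementation. For a residual r = Xw - y, sign(r) is a valid sub-gradient of |r| (any value in [-1, 1] may be chosen at r = 0).

```python
# Minimal sketch: mini-batch sub-gradient descent for the non-smooth
# MAE loss L(w) = mean(|X w - y|) on a linear model. Hyperparameters
# and the model choice are assumptions made for this illustration.
import numpy as np

def mae_subgradient_step(w, X_batch, y_batch, lr):
    """One sub-gradient step on the batch MAE loss.

    sign(r) is a valid sub-gradient of |r|; np.sign returns 0 at r = 0,
    which is an admissible choice from the sub-differential [-1, 1].
    """
    residual = X_batch @ w - y_batch                    # shape: (batch_size,)
    g = X_batch.T @ np.sign(residual) / len(y_batch)    # sub-gradient of MAE
    return w - lr * g

def batch_stochastic_subgradient(X, y, batch_size=32, lr=0.1, epochs=100, seed=0):
    """Process a defined collection of data points per iteration:
    between full-batch (all rows) and stochastic (one row) updates."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for epoch in range(epochs):
        idx = rng.permutation(n)
        # Diminishing step size: standard requirement for sub-gradient
        # methods to converge on non-smooth convex objectives.
        step = lr / np.sqrt(epoch + 1)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            w = mae_subgradient_step(w, X[batch], y[batch], step)
    return w

if __name__ == "__main__":
    # Synthetic usage example with heavy-tailed noise, which suits MAE.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 3))
    true_w = np.array([2.0, -1.0, 0.5])
    y = X @ true_w + rng.laplace(scale=0.1, size=500)
    w_hat = batch_stochastic_subgradient(X, y)
    print("estimated:", np.round(w_hat, 2), "| true:", true_w)
```

Note that with batch_size=1 this reduces to the stochastic variant and with batch_size=n to the full-batch variant, which is exactly the trade-off the abstract describes: larger batches give steadier loss estimates, smaller batches need less memory per iteration.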