Ridge parameters optimization based on minimizing model selection criterion in multivariate generalized ridge regression

IF 0.5 · CAS Tier 4 (Mathematics) · JCR Q3 (Mathematics) · Hiroshima Mathematical Journal · Pub Date: 2021-07-01 · DOI: 10.32917/H2020104
M. Ohishi
Citations: 1

Abstract

A multivariate generalized ridge (MGR) regression provides a shrinkage estimator of the multivariate linear regression by multiple ridge parameters. Since the ridge parameters which adjust the amount of shrinkage of the estimator are unknown, their optimization is an important task to obtain a better estimator. For the univariate case, a fast algorithm has been proposed for optimizing ridge parameters based on minimizing a model selection criterion (MSC), and the algorithm can be applied to various MSCs. In this paper, we extend this algorithm to MGR regression. We also describe the relationship between the MGR estimator, which is not sparse, and a multivariate adaptive group Lasso estimator, which is sparse, under orthogonal explanatory variables.
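The core idea of the abstract can be illustrated with a small sketch: a generalized ridge estimator assigns one ridge parameter per singular direction of the design matrix, and those parameters are chosen by minimizing a model selection criterion. The sketch below uses the univariate (single-response) case, generalized cross-validation (GCV) as one example MSC, and a naive coordinate-wise grid search; it is NOT the paper's fast algorithm, and all function names here are illustrative, not from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): generalized ridge
# regression with one ridge parameter per singular direction of X,
# with parameters chosen by naively minimizing GCV as an example MSC.

def gr_components(X, y):
    """SVD pieces of X and the projections of y used by GR regression."""
    U, d, Vt = np.linalg.svd(X, full_matrices=False)
    z = U.T @ y
    r2 = y @ y - z @ z  # squared residual orthogonal to the column space of X
    return U, d, Vt, z, r2

def gcv(thetas, d, z, r2, n):
    """GCV score for a vector of per-component ridge parameters."""
    h = d**2 / (d**2 + thetas)          # per-component hat-matrix eigenvalues
    rss = r2 + np.sum((1 - h)**2 * z**2)
    return n * rss / (n - np.sum(h))**2

def optimize_thetas(X, y, grid=np.logspace(-6, 6, 61), sweeps=5):
    """Coordinate-wise grid search: update each theta_j holding the rest fixed."""
    U, d, Vt, z, r2 = gr_components(X, y)
    n = len(y)
    thetas = np.ones_like(d)
    for _ in range(sweeps):
        for j in range(len(d)):
            scores = []
            for t in grid:
                trial = thetas.copy()
                trial[j] = t
                scores.append(gcv(trial, d, z, r2, n))
            thetas[j] = grid[int(np.argmin(scores))]
    return thetas

def gr_fit(X, y, thetas):
    """GR estimator: shrink each SVD component of X by its own theta_j."""
    U, d, Vt = np.linalg.svd(X, full_matrices=False)
    z = U.T @ y
    return Vt.T @ (d / (d**2 + thetas) * z)
```

With `thetas = 0` this reduces to ordinary least squares, and larger values shrink the corresponding components harder, which is the shrinkage behavior the abstract refers to; the paper's contribution is a fast, closed-form-style optimization of these parameters for the multivariate case rather than a grid search.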
Source journal: Hiroshima Mathematical Journal
CiteScore: 0.60
Self-citation rate: 0.00%
Articles per year: 12
Review time: >12 weeks
About the journal: Hiroshima Mathematical Journal (HMJ) is a continuation of Journal of Science of the Hiroshima University, Series A, Vol. 1 - 24 (1930 - 1960), and Journal of Science of the Hiroshima University, Series A - I, Vol. 25 - 34 (1961 - 1970). Starting with Volume 4 (1974), each volume of HMJ consists of three numbers annually. This journal publishes original papers in pure and applied mathematics. HMJ is an (electronically) open access journal from Volume 36, Number 1.
Latest articles in this journal:
- Hartogs' analyticity theorem for C2-mappings and maximum principle for q-convex functions
- On a class of fully nonlinear elliptic equations containing gradient terms on compact almost Hermitian manifolds
- An ℓ2,0-norm constrained matrix optimization via extended discrete first-order algorithms
- Generalized solution of the double obstacle problem for Musielak-Orlicz Dirichlet energy integral on metric measure spaces
- On meromorphic functions sharing three one-point or two-point sets