A multivariate Jacobi polynomials regression estimator associated with an ANOVA decomposition model

IF 0.9 · CAS Tier 4 (Mathematics) · JCR Q3 (Statistics & Probability) · Metrika · Pub Date: 2024-02-26 · DOI: 10.1007/s00184-024-00954-4
{"title":"与方差分解模型相关的多变量雅可比多项式回归估计器","authors":"","doi":"10.1007/s00184-024-00954-4","DOIUrl":null,"url":null,"abstract":"<h3>Abstract</h3> <p>In this work, we construct a stable and fairly fast estimator for solving multidimensional non-parametric regression problems. The proposed estimator is based on the use of a novel and special system of multivariate Jacobi polynomials that generate a basis for a reduced size of <span> <span>\\(d-\\)</span> </span>variate finite dimensional polynomials space. An ANOVA decomposition trick has been used for building this space. Also, by using some results from the theory of positive definite random matrices, we show that the proposed estimator is stable under the condition that the i.i.d. <span> <span>\\(d-\\)</span> </span>dimensional random sampling training points follow a <span> <span>\\(d-\\)</span> </span>dimensional Beta distribution. In addition, we provide the reader with an estimate for the <span> <span>\\(L^2-\\)</span> </span>risk error of the estimator. This risk error depends on the <span> <span>\\(L^2-\\)</span> </span>error of the orthogonal projection error of the regression function over the considered polynomials space. An involved study of this orthogonal projection error is done under the condition that the regression function belongs to a given weighted Sobolev space. Thanks to this novel estimate of the orthogonal projection error, we give the optimal convergence rate of our estimator. Furthermore, we give a regularized extension version of our estimator, that is capable of handling random sampling training vectors drawn according to an unknown multivariate pdf. Moreover, we derive an upper bound for the empirical risk error of this regularized estimator. Finally, we give some numerical simulations that illustrate the various theoretical results of this work. In particular, we provide simulations on a real data that compares the performance of our estimator with some existing and popular NP regression estimators.</p>","PeriodicalId":49821,"journal":{"name":"Metrika","volume":null,"pages":null},"PeriodicalIF":0.9000,"publicationDate":"2024-02-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A multivariate Jacobi polynomials regression estimator associated with an ANOVA decomposition model\",\"authors\":\"\",\"doi\":\"10.1007/s00184-024-00954-4\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<h3>Abstract</h3> <p>In this work, we construct a stable and fairly fast estimator for solving multidimensional non-parametric regression problems. The proposed estimator is based on the use of a novel and special system of multivariate Jacobi polynomials that generate a basis for a reduced size of <span> <span>\\\\(d-\\\\)</span> </span>variate finite dimensional polynomials space. An ANOVA decomposition trick has been used for building this space. Also, by using some results from the theory of positive definite random matrices, we show that the proposed estimator is stable under the condition that the i.i.d. <span> <span>\\\\(d-\\\\)</span> </span>dimensional random sampling training points follow a <span> <span>\\\\(d-\\\\)</span> </span>dimensional Beta distribution. In addition, we provide the reader with an estimate for the <span> <span>\\\\(L^2-\\\\)</span> </span>risk error of the estimator. 
This risk error depends on the <span> <span>\\\\(L^2-\\\\)</span> </span>error of the orthogonal projection error of the regression function over the considered polynomials space. An involved study of this orthogonal projection error is done under the condition that the regression function belongs to a given weighted Sobolev space. Thanks to this novel estimate of the orthogonal projection error, we give the optimal convergence rate of our estimator. Furthermore, we give a regularized extension version of our estimator, that is capable of handling random sampling training vectors drawn according to an unknown multivariate pdf. Moreover, we derive an upper bound for the empirical risk error of this regularized estimator. Finally, we give some numerical simulations that illustrate the various theoretical results of this work. In particular, we provide simulations on a real data that compares the performance of our estimator with some existing and popular NP regression estimators.</p>\",\"PeriodicalId\":49821,\"journal\":{\"name\":\"Metrika\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.9000,\"publicationDate\":\"2024-02-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Metrika\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s00184-024-00954-4\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Metrika","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s00184-024-00954-4","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Citations: 0


Abstract

In this work, we construct a stable and fairly fast estimator for solving multidimensional non-parametric regression problems. The proposed estimator is based on a novel and special system of multivariate Jacobi polynomials that generates a basis for a reduced-size finite-dimensional space of \(d\)-variate polynomials. An ANOVA decomposition trick is used to build this space. Moreover, by using some results from the theory of positive definite random matrices, we show that the proposed estimator is stable under the condition that the i.i.d. \(d\)-dimensional random training points follow a \(d\)-dimensional Beta distribution. In addition, we provide an estimate of the \(L^2\)-risk of the estimator. This risk depends on the \(L^2\)-error of the orthogonal projection of the regression function onto the considered polynomial space. A detailed study of this orthogonal projection error is carried out under the condition that the regression function belongs to a given weighted Sobolev space. Thanks to this novel estimate of the orthogonal projection error, we give the optimal convergence rate of our estimator. Furthermore, we give a regularized extension of our estimator that is capable of handling random training vectors drawn according to an unknown multivariate probability density function, and we derive an upper bound on the empirical risk error of this regularized estimator. Finally, we give some numerical simulations that illustrate the various theoretical results of this work. In particular, we provide simulations on real data that compare the performance of our estimator with some existing and popular non-parametric regression estimators.
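The abstract is prose-only; as a rough illustration of the kind of estimator it describes, the sketch below (Python with SciPy) builds a tensor-product Jacobi polynomial basis truncated in the functional-ANOVA spirit (a \(d\)-variate function is decomposed as \(f(x)=f_0+\sum_i f_i(x_i)+\sum_{i<j} f_{ij}(x_i,x_j)+\cdots\), so capping the interaction order shrinks the basis) and fits it by optionally ridge-regularized least squares. This is a minimal sketch of the general technique under assumed parameters (`max_degree`, `max_order`, `alpha`, `beta`, `ridge`), not the authors' construction: the paper's reduced basis, normalization, stability analysis, and regularization scheme differ in detail.

```python
# Minimal sketch (not the paper's exact estimator): ANOVA-truncated
# multivariate Jacobi least-squares regression. All tuning parameters
# below are illustrative assumptions.

from itertools import product

import numpy as np
from scipy.special import eval_jacobi


def anova_index_set(d, max_degree, max_order):
    """Multi-indices k in {0,...,max_degree}^d with at most `max_order`
    non-zero entries (the ANOVA-style truncation that shrinks the basis)."""
    return [k for k in product(range(max_degree + 1), repeat=d)
            if sum(ki > 0 for ki in k) <= max_order]


def jacobi_design_matrix(X, indices, alpha, beta):
    """Design matrix whose columns are products of univariate Jacobi
    polynomials; inputs in [0, 1] are mapped to [-1, 1]."""
    T = 2.0 * np.asarray(X, dtype=float) - 1.0          # [0,1] -> [-1,1]
    Phi = np.ones((T.shape[0], len(indices)))
    for j, k in enumerate(indices):
        for i, ki in enumerate(k):
            if ki > 0:
                Phi[:, j] *= eval_jacobi(ki, alpha, beta, T[:, i])
    return Phi


def fit(X, y, max_degree=3, max_order=2, alpha=0.0, beta=0.0, ridge=0.0):
    """Least-squares coefficients; ridge > 0 gives a regularized variant."""
    indices = anova_index_set(X.shape[1], max_degree, max_order)
    Phi = jacobi_design_matrix(X, indices, alpha, beta)
    G = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    coef = np.linalg.solve(G, Phi.T @ y)
    return indices, coef


def predict(X, indices, coef, alpha=0.0, beta=0.0):
    return jacobi_design_matrix(X, indices, alpha, beta) @ coef


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 3
    X = rng.beta(2.0, 2.0, size=(500, d))               # Beta-distributed design
    y = np.sin(np.pi * X[:, 0]) + X[:, 1] * X[:, 2] + 0.1 * rng.standard_normal(500)
    idx, coef = fit(X, y, max_degree=4, max_order=2, alpha=1.0, beta=1.0)
    print("basis size:", len(idx))
    print("train RMSE:", np.sqrt(np.mean((predict(X, idx, coef, 1.0, 1.0) - y) ** 2)))
```

In this sketch, choosing `alpha = beta = 1` matches the Jacobi weight \((1-t)^{\alpha}(1+t)^{\beta}\) on \([-1,1]\) to a Beta(2, 2) design mapped from \([0,1]\), loosely mirroring the paper's condition that the training points follow a \(d\)-dimensional Beta distribution; the ridge option merely stands in for the paper's regularized extension aimed at unknown sampling densities.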

Source journal: Metrika (Mathematics - Statistics & Probability)
CiteScore: 1.50
Self-citation rate: 14.30%
Articles per year: 39
Review time: 6-12 weeks
Journal description: Metrika is an international journal for theoretical and applied statistics. Metrika publishes original research papers in the field of mathematical statistics and statistical methods. Great importance is attached to new developments in theoretical statistics, statistical modeling and to actual innovative applicability of the proposed statistical methods and results. Topics of interest include, without being limited to, multivariate analysis, high dimensional statistics and nonparametric statistics; categorical data analysis and latent variable models; reliability, lifetime data analysis and statistics in engineering sciences.
Latest articles in this journal:
Smoothed partially linear varying coefficient quantile regression with nonignorable missing response
Two-stage and purely sequential minimum risk point estimation of the scale parameter of a family of distributions under modified LINEX loss plus sampling cost
Construction of three-level factorial designs with general minimum lower-order confounding via resolution IV designs
Mean test for high-dimensional data based on covariance matrix with linear structures
Bounds of expectations of order statistics for distributions possessing monotone reversed failure rates