{"title":"Hermite regression estimation in noisy convolution model","authors":"Ousmane Sacko","doi":"10.1016/j.jspi.2024.106168","DOIUrl":null,"url":null,"abstract":"<div><p>In this paper, we consider the following regression model: <span><math><mrow><mi>y</mi><mrow><mo>(</mo><mi>k</mi><mi>T</mi><mo>/</mo><mi>n</mi><mo>)</mo></mrow><mo>=</mo><mi>f</mi><mo>⋆</mo><mi>g</mi><mrow><mo>(</mo><mi>k</mi><mi>T</mi><mo>/</mo><mi>n</mi><mo>)</mo></mrow><mo>+</mo><msub><mrow><mi>ɛ</mi></mrow><mrow><mi>k</mi></mrow></msub><mo>,</mo><mi>k</mi><mo>=</mo><mo>−</mo><mi>n</mi><mo>,</mo><mo>…</mo><mo>,</mo><mi>n</mi><mo>−</mo><mn>1</mn></mrow></math></span>, <span><math><mi>T</mi></math></span> fixed, where <span><math><mi>g</mi></math></span> is known and <span><math><mi>f</mi></math></span> is the unknown function to be estimated. The errors <span><math><msub><mrow><mrow><mo>(</mo><msub><mrow><mi>ɛ</mi></mrow><mrow><mi>k</mi></mrow></msub><mo>)</mo></mrow></mrow><mrow><mo>−</mo><mi>n</mi><mo>≤</mo><mi>k</mi><mo>≤</mo><mi>n</mi><mo>−</mo><mn>1</mn></mrow></msub></math></span> are independent and identically distributed centered with finite known variance. Two adaptive estimation methods for <span><math><mi>f</mi></math></span> are considered by exploiting the properties of the Hermite basis. We study the quadratic risk of each estimator. If <span><math><mi>f</mi></math></span> belongs to Sobolev regularity spaces, we derive rates of convergence. Adaptive procedures to select the relevant parameter inspired by the Goldenshluger and Lepski method are proposed and we prove that the resulting estimators satisfy oracle inequalities for sub-Gaussian <span><math><mi>ɛ</mi></math></span>’s. Finally, we illustrate numerically these approaches.</p></div>","PeriodicalId":0,"journal":{"name":"","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0378375824000259","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
In this paper, we consider the following regression model: $y(kT/n) = f \star g(kT/n) + \varepsilon_k$, $k = -n, \dots, n-1$, with $T$ fixed, where $g$ is known and $f$ is the unknown function to be estimated. The errors $(\varepsilon_k)_{-n \le k \le n-1}$ are independent and identically distributed, centered, with finite known variance. Two adaptive estimation methods for $f$ are considered by exploiting the properties of the Hermite basis. We study the quadratic risk of each estimator. If $f$ belongs to Sobolev regularity spaces, we derive rates of convergence. Adaptive procedures to select the relevant parameter, inspired by the Goldenshluger and Lepski method, are proposed, and we prove that the resulting estimators satisfy oracle inequalities for sub-Gaussian $\varepsilon_k$'s. Finally, we illustrate these approaches numerically.
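To make the setting concrete, below is a minimal Python sketch of the observation model and of a naive Hermite projection step. The signal `f`, the kernel `g`, the noise level `sigma`, and the dimension `m` are illustrative assumptions, not choices from the paper; the sketch only projects the noisy observations of $f \star g$ on the first $m$ Hermite functions and does not reproduce the paper's deconvolution estimators of $f$ or the Goldenshluger–Lepski selection rule.

```python
import numpy as np
from scipy.special import eval_hermite, factorial

def hermite_basis(j, x):
    """Hermite function h_j(x) = c_j * H_j(x) * exp(-x^2 / 2),
    with c_j chosen so that the h_j are orthonormal on the real line."""
    c = 1.0 / np.sqrt(2.0**j * factorial(j) * np.sqrt(np.pi))
    return c * eval_hermite(j, x) * np.exp(-x**2 / 2.0)

# --- simulate the convolution regression model (illustrative choices) ---
rng = np.random.default_rng(0)
T, n, sigma = 5.0, 500, 0.1               # T fixed, 2n design points, noise s.d.
t = np.arange(-n, n) * T / n              # design points kT/n, k = -n, ..., n-1

f = lambda x: np.exp(-x**2)               # hypothetical unknown signal
g = lambda x: 0.5 * np.exp(-np.abs(x))    # known convolution kernel (assumed)

# f * g(t) approximated by a Riemann sum on a fine grid
u = np.linspace(-20.0, 20.0, 4001)
conv = np.array([np.trapz(f(u) * g(ti - u), u) for ti in t])
y = conv + sigma * rng.standard_normal(2 * n)   # observations y(kT/n)

# --- naive Hermite projection of the noisy data, for a fixed dimension m ---
m = 15
Phi = np.column_stack([hermite_basis(j, t) for j in range(m)])  # (2n, m) design
coef = Phi.T @ y * (T / n)                # Riemann approximation of <y, h_j>
fg_hat = Phi @ coef                       # estimate of f * g on the design points
print("empirical MSE of the projection step:", np.mean((fg_hat - conv) ** 2))
```

In the paper, the choice of the projection dimension is made adaptively; the fixed `m = 15` above is only a placeholder for that data-driven selection.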