{"title":"Shrinkage estimators of BLUE for time series regression models","authors":"Yujie Xue , Masanobu Taniguchi , Tong Liu","doi":"10.1016/j.jmva.2023.105282","DOIUrl":null,"url":null,"abstract":"<div><p><span>The least squares estimator<span><span> (LSE) seems a natural estimator of linear regression models. Whereas, if the dimension of the vector of regression coefficients is greater than 1 and the residuals are dependent, the best </span>linear unbiased estimator<span> (BLUE), which includes the information of the covariance matrix </span></span></span><span><math><mi>Γ</mi></math></span><span><span> of residual process has a better performance than LSE in the sense of mean square error. As we know the </span>unbiased estimators<span> are generally inadmissible, Senda and Taniguchi (2006) introduced a James–Stein type shrinkage estimator for the regression coefficients based on LSE, where the residual process is a Gaussian stationary process, and provides sufficient conditions such that the James–Stein type shrinkage estimator improves LSE. In this paper, we propose a shrinkage estimator based on BLUE. Sufficient conditions for this shrinkage estimator to improve BLUE are also given. Furthermore, since </span></span><span><math><mi>Γ</mi></math></span> is infeasible, assuming that <span><math><mi>Γ</mi></math></span> has a form of <span><math><mrow><mi>Γ</mi><mo>=</mo><mi>Γ</mi><mrow><mo>(</mo><mi>θ</mi><mo>)</mo></mrow></mrow></math></span>, we introduce a feasible version of that shrinkage estimator with replacing <span><math><mrow><mi>Γ</mi><mrow><mo>(</mo><mi>θ</mi><mo>)</mo></mrow></mrow></math></span> by <span><math><mrow><mi>Γ</mi><mrow><mo>(</mo><mover><mrow><mi>θ</mi></mrow><mrow><mo>ˆ</mo></mrow></mover><mo>)</mo></mrow></mrow></math></span><span> which is introduced in Toyooka (1986). Additionally, we give the sufficient conditions where the feasible version improves BLUE. 
Besides, the results of a numerical studies confirm our approach.</span></p></div>","PeriodicalId":16431,"journal":{"name":"Journal of Multivariate Analysis","volume":"202 ","pages":"Article 105282"},"PeriodicalIF":1.4000,"publicationDate":"2023-12-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Multivariate Analysis","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0047259X23001288","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Citations: 0
Abstract
The least squares estimator (LSE) seems a natural estimator for linear regression models. However, if the dimension of the vector of regression coefficients is greater than 1 and the residuals are dependent, the best linear unbiased estimator (BLUE), which incorporates the information of the covariance matrix Γ of the residual process, performs better than the LSE in the sense of mean square error. Since unbiased estimators are generally inadmissible, Senda and Taniguchi (2006) introduced a James–Stein type shrinkage estimator for the regression coefficients based on the LSE, where the residual process is a Gaussian stationary process, and provided sufficient conditions under which the James–Stein type shrinkage estimator improves on the LSE. In this paper, we propose a shrinkage estimator based on the BLUE, and give sufficient conditions for this shrinkage estimator to improve on the BLUE. Furthermore, since Γ is infeasible, assuming that Γ has the parametric form Γ = Γ(θ), we introduce a feasible version of the shrinkage estimator, replacing Γ(θ) by Γ(θ̂) as introduced in Toyooka (1986). We also give sufficient conditions under which this feasible version improves on the BLUE. Finally, numerical studies confirm our approach.
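The comparison described in the abstract can be illustrated with a minimal Monte Carlo sketch. This is not the paper's exact construction: it assumes Gaussian AR(1) residuals with a known covariance matrix Γ (a hypothetical instance of the parametric form Γ(θ) with θ the AR coefficient), and applies a classical positive-part James–Stein shrinkage to the BLUE, shrinking toward zero; the shrinkage constant p − 2 and the zero target are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, phi, reps = 100, 3, 0.7, 500
beta = np.zeros(p)  # true coefficients; a zero target favors shrinkage (illustrative case)

# AR(1) covariance matrix Gamma: a hypothetical instance of Gamma(theta), theta = phi
Gamma = phi ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
Gamma_inv = np.linalg.inv(Gamma)
L = np.linalg.cholesky(Gamma)  # for simulating correlated residuals

X = rng.standard_normal((n, p))
A = X.T @ Gamma_inv @ X        # precision matrix of the BLUE
A_inv = np.linalg.inv(A)

mse = {"LSE": 0.0, "BLUE": 0.0, "JS-BLUE": 0.0}
for _ in range(reps):
    eps = L @ rng.standard_normal(n)            # Gaussian stationary AR(1) residuals
    y = X @ beta + eps
    b_ls = np.linalg.lstsq(X, y, rcond=None)[0]           # least squares estimator
    b_blue = A_inv @ X.T @ Gamma_inv @ y                  # BLUE (generalized LS)
    q = b_blue @ A @ b_blue                               # Mahalanobis norm of the BLUE
    b_js = max(0.0, 1.0 - (p - 2) / q) * b_blue           # positive-part James-Stein shrinkage
    for name, b in [("LSE", b_ls), ("BLUE", b_blue), ("JS-BLUE", b_js)]:
        mse[name] += np.sum((b - beta) ** 2) / reps

print({k: round(v, 4) for k, v in mse.items()})
```

Under this setup the empirical mean square error of the BLUE falls below that of the LSE, and the shrunken BLUE improves further; replacing the known Γ with Γ(θ̂) estimated from residuals would give a feasible analogue in the spirit of Toyooka (1986).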
About the journal:
Founded in 1971, the Journal of Multivariate Analysis (JMVA) is the central venue for the publication of new, relevant methodology and particularly innovative applications pertaining to the analysis and interpretation of multidimensional data.
The journal welcomes contributions to all aspects of multivariate data analysis and modeling, including cluster analysis, discriminant analysis, factor analysis, and multidimensional continuous or discrete distribution theory. Topics of current interest include, but are not limited to, inferential aspects of
Copula modeling
Functional data analysis
Graphical modeling
High-dimensional data analysis
Image analysis
Multivariate extreme-value theory
Sparse modeling
Spatial statistics.