Conditional nonparametric variable screening by neural factor regression

Jianqing Fan (Princeton University), Weining Wang (University of Groningen), Yue Zhao (University of York)
{"title":"通过神经因子回归进行条件非参数变量筛选","authors":"Jianqing FanPrinceton University, Weining WangUniversity of Groningen, Yue ZhaoUniversity of York","doi":"arxiv-2408.10825","DOIUrl":null,"url":null,"abstract":"High-dimensional covariates often admit linear factor structure. To\neffectively screen correlated covariates in high-dimension, we propose a\nconditional variable screening test based on non-parametric regression using\nneural networks due to their representation power. We ask the question whether\nindividual covariates have additional contributions given the latent factors or\nmore generally a set of variables. Our test statistics are based on the\nestimated partial derivative of the regression function of the candidate\nvariable for screening and a observable proxy for the latent factors. Hence,\nour test reveals how much predictors contribute additionally to the\nnon-parametric regression after accounting for the latent factors. Our\nderivative estimator is the convolution of a deep neural network regression\nestimator and a smoothing kernel. We demonstrate that when the neural network\nsize diverges with the sample size, unlike estimating the regression function\nitself, it is necessary to smooth the partial derivative of the neural network\nestimator to recover the desired convergence rate for the derivative. Moreover,\nour screening test achieves asymptotic normality under the null after finely\ncentering our test statistics that makes the biases negligible, as well as\nconsistency for local alternatives under mild conditions. We demonstrate the\nperformance of our test in a simulation study and two real world applications.","PeriodicalId":501293,"journal":{"name":"arXiv - ECON - Econometrics","volume":"1587 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Conditional nonparametric variable screening by neural factor regression\",\"authors\":\"Jianqing FanPrinceton University, Weining WangUniversity of Groningen, Yue ZhaoUniversity of York\",\"doi\":\"arxiv-2408.10825\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"High-dimensional covariates often admit linear factor structure. To\\neffectively screen correlated covariates in high-dimension, we propose a\\nconditional variable screening test based on non-parametric regression using\\nneural networks due to their representation power. We ask the question whether\\nindividual covariates have additional contributions given the latent factors or\\nmore generally a set of variables. Our test statistics are based on the\\nestimated partial derivative of the regression function of the candidate\\nvariable for screening and a observable proxy for the latent factors. Hence,\\nour test reveals how much predictors contribute additionally to the\\nnon-parametric regression after accounting for the latent factors. Our\\nderivative estimator is the convolution of a deep neural network regression\\nestimator and a smoothing kernel. We demonstrate that when the neural network\\nsize diverges with the sample size, unlike estimating the regression function\\nitself, it is necessary to smooth the partial derivative of the neural network\\nestimator to recover the desired convergence rate for the derivative. 
Moreover,\\nour screening test achieves asymptotic normality under the null after finely\\ncentering our test statistics that makes the biases negligible, as well as\\nconsistency for local alternatives under mild conditions. We demonstrate the\\nperformance of our test in a simulation study and two real world applications.\",\"PeriodicalId\":501293,\"journal\":{\"name\":\"arXiv - ECON - Econometrics\",\"volume\":\"1587 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-08-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - ECON - Econometrics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2408.10825\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - ECON - Econometrics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.10825","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

High-dimensional covariates often admit a linear factor structure. To effectively screen correlated covariates in high dimensions, we propose a conditional variable screening test based on nonparametric regression using neural networks, chosen for their representation power. We ask whether individual covariates contribute additionally given the latent factors, or more generally given a set of conditioning variables. Our test statistic is based on the estimated partial derivative of the regression function with respect to the candidate variable, conditioning on an observable proxy for the latent factors. The test therefore reveals how much a predictor contributes to the nonparametric regression beyond what the latent factors already explain. Our derivative estimator is the convolution of a deep neural network regression estimator with a smoothing kernel. We show that when the size of the neural network diverges with the sample size, unlike for estimating the regression function itself, it is necessary to smooth the partial derivative of the network estimator to recover the desired convergence rate for the derivative. Moreover, after a fine centering of the test statistic that makes the bias negligible, the screening test is asymptotically normal under the null and consistent against local alternatives under mild conditions. We demonstrate the performance of the test in a simulation study and in two real-world applications.
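The abstract does not display the model or the estimator in formulas. The sketch below is one reading of the ingredients it names (a linear factor structure, screening conditional on the factors, and a kernel-smoothed network derivative); the notation f_i, B, m, K_h is introduced here for illustration and is not necessarily the paper's.

```latex
% Minimal sketch of the setup described in the abstract (notation is ours, not the paper's).
\begin{align*}
  & X_i = B f_i + u_i, \qquad i = 1,\dots,n,
    && \text{(linear factor structure of the covariates)} \\
  & Y_i = m(f_i, X_{ij}) + \varepsilon_i,
    && \text{(nonparametric regression on factors and candidate covariate } X_{ij}\text{)} \\
  & H_0 \colon \partial m(f, x_j)/\partial x_j \equiv 0,
    && \text{(no additional contribution of } X_{ij} \text{ given the factors)} \\
  & \widehat{\partial_j m}(x) \;=\; \partial_j\bigl(\widehat m * K_h\bigr)(x)
      \;=\; \int \widehat m(x - v)\,\partial_j K_h(v)\, dv.
    && \text{(kernel-smoothed network derivative)}
\end{align*}
```

Here \(\widehat m\) stands for the deep neural network regression fit, and in practice the latent \(f_i\) would be replaced by an observable proxy, for example estimated factors, as the abstract indicates.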
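As a concrete illustration of the "convolution of a network estimator with a smoothing kernel" idea, the snippet below computes a kernel-smoothed partial derivative by Monte Carlo for a Gaussian kernel. It is a toy sketch under our own assumptions, not the authors' implementation; the function `m_hat` is merely a stand-in for a fitted deep network regression estimator.

```python
import numpy as np

# Toy sketch (not the paper's code): kernel-smoothed partial derivative of a
# regression estimator, i.e. the derivative of the convolution m_hat * K_h.
# For a Gaussian kernel K_h, differentiating the convolution gives the identity
#   d/dx_j (m_hat * K_h)(x) = (1/h^2) * E[ V_j * m_hat(x + V) ],  V ~ N(0, h^2 I),
# which is approximated here by Monte Carlo.

rng = np.random.default_rng(0)

def m_hat(x):
    """Stand-in for a fitted neural network estimator m_hat(x): a smooth toy
    function of (factor proxy, candidate covariate)."""
    x = np.atleast_2d(x)
    return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

def smoothed_partial(m, x, j, h=0.3, n_mc=20000):
    """Monte Carlo estimate of the j-th partial derivative of (m * K_h) at x."""
    x = np.atleast_1d(x).astype(float)
    V = rng.normal(scale=h, size=(n_mc, x.size))   # V ~ N(0, h^2 I)
    return np.mean(V[:, j] * m(x[None, :] + V)) / h**2

x0 = np.array([0.2, 1.0])
# True derivative of the toy m_hat with respect to x_1 at x0 is 1.0.
print(smoothed_partial(m_hat, x0, j=1))
```

In the screening context, such smoothed derivatives evaluated at the candidate covariate (given the factor proxy) would feed into a centered test statistic; the centering and limiting distribution are the paper's contribution and are not reproduced here.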