Neural Networks for Partially Linear Quantile Regression

Qixian Zhong, Jane-ling Wang
{"title":"Neural Networks for Partially Linear Quantile Regression","authors":"Qixian Zhong, Jane-ling Wang","doi":"10.1080/07350015.2023.2208183","DOIUrl":null,"url":null,"abstract":"Deep learning has enjoyed tremendous success in a variety of applications but its application to quantile regressions remains scarce. A major advantage of the deep learning approach is its flexibility to model complex data in a more parsimonious way than nonparametric smoothing methods. However, while deep learning brought breakthroughs in prediction, it often lacks interpretability due to the black-box nature of multilayer structure with millions of parameters, hence it is not well suited for statistical inference. In this paper, we leverage the advantages of deep learning to apply it to quantile regression where the goal to produce interpretable results and perform statistical inference. We achieve this by adopting a semiparametric approach based on the partially linear quantile regression model, where covariates of primary interest for statistical inference are modelled linearly and all other covariates are modelled nonparametrically by means of a deep neural network. In addition to the new methodology, we provide theoretical justification for the proposed model by establishing the root-$n$ consistency and asymptotically normality of the parametric coefficient estimator and the minimax optimal convergence rate of the neural nonparametric function estimator. Across several simulated and real data examples, our proposed model empirically produces superior estimates and more accurate predictions than various alternative approaches.","PeriodicalId":118766,"journal":{"name":"Journal of Business & Economic Statistics","volume":"4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Business & Economic Statistics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/07350015.2023.2208183","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Deep learning has enjoyed tremendous success in a variety of applications, but its application to quantile regressions remains scarce. A major advantage of the deep learning approach is its flexibility to model complex data in a more parsimonious way than nonparametric smoothing methods. However, while deep learning has brought breakthroughs in prediction, it often lacks interpretability due to the black-box nature of its multilayer structure with millions of parameters, and hence it is not well suited for statistical inference. In this paper, we leverage the advantages of deep learning and apply it to quantile regression, where the goal is to produce interpretable results and perform statistical inference. We achieve this by adopting a semiparametric approach based on the partially linear quantile regression model, where covariates of primary interest for statistical inference are modelled linearly and all other covariates are modelled nonparametrically by means of a deep neural network. In addition to the new methodology, we provide theoretical justification for the proposed model by establishing the root-$n$ consistency and asymptotic normality of the parametric coefficient estimator and the minimax optimal convergence rate of the neural nonparametric function estimator. Across several simulated and real data examples, our proposed model empirically produces superior estimates and more accurate predictions than various alternative approaches.
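To make the model structure concrete, the following is a minimal sketch (not the authors' code) of how a partially linear quantile regression of this kind could be set up: the covariates of primary interest enter through an interpretable linear term, the remaining covariates pass through a small feed-forward network, and both parts are fit jointly by minimizing the check (pinball) loss at a chosen quantile level. The architecture, training settings, and synthetic data below are illustrative assumptions, not the paper's implementation.

```python
# Sketch of a partially linear quantile regression network:
# Q_tau(Y | Z, X) = Z'beta + g(X), with beta estimated by a linear layer
# and g(.) by a small feed-forward network, both trained on the check loss.
import torch
import torch.nn as nn


class PartiallyLinearQuantileNet(nn.Module):
    def __init__(self, dim_linear, dim_nonlinear, hidden=32, depth=2):
        super().__init__()
        # Linear part: interpretable coefficients for the covariates of interest.
        self.beta = nn.Linear(dim_linear, 1, bias=False)
        # Nonparametric part: deep network g(X) for the remaining covariates.
        layers, d = [], dim_nonlinear
        for _ in range(depth):
            layers += [nn.Linear(d, hidden), nn.ReLU()]
            d = hidden
        layers += [nn.Linear(d, 1)]
        self.g = nn.Sequential(*layers)

    def forward(self, z, x):
        # Conditional tau-quantile model: Z'beta + g(X).
        return self.beta(z) + self.g(x)


def check_loss(residual, tau):
    # Check (pinball) loss: rho_tau(u) = u * (tau - 1{u < 0}).
    return torch.mean(residual * (tau - (residual < 0).float()))


# Illustrative usage on synthetic data (shapes and settings are assumptions).
torch.manual_seed(0)
n, p, q, tau = 500, 2, 5, 0.5
z, x = torch.randn(n, p), torch.randn(n, q)
y = z @ torch.tensor([1.0, -0.5]) + torch.sin(x[:, 0]) + 0.1 * torch.randn(n)

model = PartiallyLinearQuantileNet(dim_linear=p, dim_nonlinear=q)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = check_loss(y.unsqueeze(1) - model(z, x), tau)
    loss.backward()
    opt.step()

print("estimated linear coefficients:", model.beta.weight.data)
```

The weights of the linear layer play the role of the parametric coefficient estimator whose root-$n$ asymptotics the paper studies; this sketch does not attempt to reproduce the paper's inference procedure or the network-size conditions behind its theoretical results.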