Robust Estimation of Regression Models with Potentially Endogenous Outliers via a Modern Optimization Lens

Zhan Gao, Hyungsik Roger Moon
arXiv - ECON - Econometrics · Published 2024-08-07 · arXiv:2408.03930

Abstract

This paper addresses the robust estimation of linear regression models in the presence of potentially endogenous outliers. Through Monte Carlo simulations, we demonstrate that existing $L_1$-regularized estimation methods, including the Huber estimator and the least absolute deviation (LAD) estimator, exhibit significant bias when outliers are endogenous. Motivated by this finding, we investigate $L_0$-regularized estimation methods. We propose systematic heuristic algorithms, notably an iterative hard-thresholding algorithm and a local combinatorial search refinement, to solve the combinatorial optimization problem of the \(L_0\)-regularized estimation efficiently. Our Monte Carlo simulations yield two key results: (i) The local combinatorial search algorithm substantially improves solution quality compared to the initial projection-based hard-thresholding algorithm while offering greater computational efficiency than directly solving the mixed integer optimization problem. (ii) The $L_0$-regularized estimator demonstrates superior performance in terms of bias reduction, estimation accuracy, and out-of-sample prediction errors compared to $L_1$-regularized alternatives. We illustrate the practical value of our method through an empirical application to stock return forecasting.
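The iterative hard-thresholding idea mentioned in the abstract can be illustrated with a minimal sketch. In the mean-shift formulation, the model is $y = X\beta + \gamma + \varepsilon$, where the sparse vector $\gamma$ absorbs outlier shifts and the $L_0$ penalty bounds the number of its nonzero entries. The sketch below (not the paper's exact algorithm; the function name, the fixed outlier budget `k`, and the stopping rule are illustrative assumptions) alternates between OLS on the currently "clean" observations and re-flagging the `k` largest-residual observations as outliers:

```python
import numpy as np

def l0_robust_ols(X, y, k, max_iter=100):
    """Iterative hard-thresholding sketch for L0-regularized robust regression.

    Model: y = X @ beta + gamma + eps, where gamma has at most k nonzero
    entries (the outlier shifts). Alternates between (i) OLS on the
    observations not currently flagged as outliers and (ii) re-selecting
    the k largest-absolute-residual observations as the outlier set.
    """
    n = X.shape[0]
    outliers = np.zeros(n, dtype=bool)  # start with no flagged outliers
    for _ in range(max_iter):
        clean = ~outliers
        beta, *_ = np.linalg.lstsq(X[clean], y[clean], rcond=None)
        resid = y - X @ beta
        # hard-thresholding step: flag the k largest |residual| observations
        new_outliers = np.zeros(n, dtype=bool)
        new_outliers[np.argsort(-np.abs(resid))[:k]] = True
        if np.array_equal(new_outliers, outliers):
            break  # outlier support has stabilized
        outliers = new_outliers
    return beta, outliers
```

A local combinatorial search refinement, as described in the abstract, would then try swapping flagged and unflagged observations to further reduce the residual sum of squares, trading extra computation for solution quality below that of the full mixed-integer optimization.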