Paper Title

Robust Estimation and Inference for Expected Shortfall Regression with Many Regressors

Paper Authors

Xuming He, Kean Ming Tan, Wen-Xin Zhou

Paper Abstract

Expected Shortfall (ES), also known as superquantile or Conditional Value-at-Risk, has been recognized as an important measure in risk analysis and stochastic optimization, and is also finding applications beyond these areas. In finance, it refers to the conditional expected return of an asset given that the return is below some quantile of its distribution. In this paper, we consider a recently proposed joint regression framework that simultaneously models the quantile and the ES of a response variable given a set of covariates, for which the state-of-the-art approach is based on minimizing a joint loss function that is non-differentiable and non-convex. This inevitably raises numerical challenges and limits its applicability for analyzing large-scale data. Motivated by the idea of using Neyman-orthogonal scores to reduce sensitivity with respect to nuisance parameters, we propose a statistically robust (to highly skewed and heavy-tailed data) and computationally efficient two-step procedure for fitting joint quantile and ES regression models. With increasing covariate dimensions, we establish explicit non-asymptotic bounds on estimation and Gaussian approximation errors, which lay the foundation for statistical inference. Finally, we demonstrate through numerical experiments and two data applications that our approach well balances robustness, statistical, and numerical efficiencies for expected shortfall regression.
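
The abstract outlines a two-step estimator: first fit an alpha-level quantile regression, then fit the ES regression by (robust) regression on a Neyman-orthogonalized surrogate response whose conditional mean equals the expected shortfall. Below is a minimal Python sketch of that recipe using scikit-learn. The function name two_step_es_regression is illustrative, the surrogate-response formula follows the orthogonal-score construction the abstract refers to, and the fixed-tuning HuberRegressor merely stands in for the paper's adaptive Huber step; treat this as a sketch under those assumptions, not the authors' implementation.

import numpy as np
from sklearn.linear_model import QuantileRegressor, HuberRegressor

def two_step_es_regression(X, y, alpha=0.05):
    """Two-step expected shortfall regression (sketch).

    Step 1: fit an alpha-level quantile regression.
    Step 2: regress the Neyman-orthogonal surrogate response
    on X with a Huber loss for robustness to heavy tails.
    """
    # Step 1: unpenalized quantile regression at level alpha.
    qr = QuantileRegressor(quantile=alpha, alpha=0.0, solver="highs")
    qr.fit(X, y)
    q_hat = qr.predict(X)  # fitted conditional quantiles

    # Step 2: surrogate response
    #   Z_i = q_hat_i + (1/alpha) * (y_i - q_hat_i) * 1{y_i <= q_hat_i},
    # which satisfies E[Z | X] = ES_alpha(Y | X).
    Z = q_hat + (y - q_hat) * (y <= q_hat) / alpha

    # Huber loss guards against heavy-tailed errors; plain OLS on Z
    # would also be consistent under lighter tails.
    es = HuberRegressor().fit(X, Z)
    return qr, es

# Example usage on simulated heavy-tailed data:
rng = np.random.default_rng(0)
n, p = 500, 5
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.standard_t(df=3, size=n)
qr_fit, es_fit = two_step_es_regression(X, y, alpha=0.1)
print(es_fit.coef_)  # estimated ES regression slopes

Because step 2 is an ordinary (robust) least-squares fit rather than a minimization of the non-differentiable, non-convex joint loss, the procedure scales to large samples and many regressors, which is the computational point the abstract emphasizes.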
