Paper Title

Non-asymptotic Optimal Prediction Error for Growing-dimensional Partially Functional Linear Models

Paper Authors

Huiming Zhang, Xiaoyu Lei

Paper Abstract

Under the reproducing kernel Hilbert space (RKHS) framework, we consider the penalized least-squares estimator for partially functional linear models (PFLM), whose predictor contains both a functional and a traditional multivariate part, where the multivariate part allows a divergent number of parameters. From the non-asymptotic point of view, we focus on rate-optimal upper and lower bounds for the prediction error. An exact upper bound for the excess prediction risk is shown in non-asymptotic form under a general assumption on the effective dimension of the model, from which we also obtain prediction consistency when the number of multivariate covariates $p$ slowly increases with the sample size $n$. Our new finding implies a trade-off between the number of non-functional predictors and the effective dimension of the kernel principal components that is required to ensure prediction consistency in the increasing-dimensional setting. The analysis in our proof hinges on the spectral condition of the sandwich operator of the covariance operator and the reproducing kernel, and on sub-Gaussian and Bernstein concentration inequalities for random elements in Hilbert space. Finally, we derive a non-asymptotic minimax lower bound under a regularity assumption on the Kullback-Leibler divergence of the models.
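To make the model concrete, the following is a minimal illustrative sketch of penalized least squares for a partially functional linear model: the functional covariate is discretized on a grid, the integral $\int X_i(t) f(t)\,dt$ is approximated by a Riemann sum, and a second-difference roughness penalty stands in for the RKHS norm penalty. All simulation settings (sample size, grid, true slope function, penalty level) are hypothetical choices for illustration, not the estimator or tuning used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, T = 200, 3, 50                    # samples, scalar covariates, grid points
t = np.linspace(0, 1, T)
dt = t[1] - t[0]

# hypothetical ground truth for the simulation
beta = np.array([1.0, -0.5, 2.0])       # multivariate coefficients
f_true = np.sin(2 * np.pi * t)          # functional slope f(t)

Z = rng.normal(size=(n, p))                          # multivariate part
X = rng.normal(size=(n, T)).cumsum(axis=1) * np.sqrt(dt)  # rough functional paths
# Y_i = Z_i' beta + integral X_i(t) f(t) dt + noise, integral via Riemann sum
y = Z @ beta + (X * f_true).sum(axis=1) * dt + 0.1 * rng.normal(size=n)

# penalized least squares: min ||y - Z b - W f||^2 + lam * ||D2 f||^2
W = X * dt                              # quadrature weights absorbed into design
D2 = np.diff(np.eye(T), n=2, axis=0)    # second-difference roughness penalty
lam = 1e-3

M = np.hstack([Z, W])                   # joint design for (beta, f)
P = np.zeros((p + T, p + T))
P[p:, p:] = lam * D2.T @ D2             # penalize only the functional block
theta = np.linalg.solve(M.T @ M + P, M.T @ y)
beta_hat, f_hat = theta[:p], theta[p:]

print("beta_hat:", np.round(beta_hat, 2))
```

The joint solve reflects the structure the abstract describes: the multivariate coefficients are left unpenalized while the functional slope is regularized, and the quality of the recovered $f$ is governed by the smoothing level, playing the role the effective dimension plays in the paper's bounds.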
