Title
Multiple Descent in the Multiple Random Feature Model
Authors
Abstract
Recent works have demonstrated a double descent phenomenon in over-parameterized learning. Although this phenomenon has attracted considerable attention, it is not yet fully understood in theory. In this paper, we investigate the multiple descent phenomenon in a class of multi-component prediction models. We first consider a ''double random feature model'' (DRFM) that concatenates two types of random features, and study the excess risk achieved by the DRFM in ridge regression. We calculate the precise limit of the excess risk under the high-dimensional framework where the training sample size, the dimension of the data, and the dimension of the random features tend to infinity proportionally. Based on this calculation, we further demonstrate theoretically that the risk curves of DRFMs can exhibit triple descent. We then provide a thorough experimental study to verify our theory. Finally, we extend our study to the ''multiple random feature model'' (MRFM), and show that MRFMs ensembling $K$ types of random features may exhibit $(K+1)$-fold descent. Our analysis shows that risk curves with a specific number of descents generally exist in learning multi-component prediction models.
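To make the setup concrete, the following is a minimal NumPy sketch of a double random feature model: two independent random feature maps are concatenated and fitted with ridge regression. The dimensions, the linear teacher, and the choice of ReLU and cosine nonlinearities are illustrative assumptions for this sketch, not the paper's exact setting (which studies the proportional high-dimensional limit).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed for this sketch).
n, d = 200, 50        # training sample size, data dimension
N1, N2 = 80, 120      # widths of the two random feature maps
lam = 1e-3            # ridge penalty

# Hypothetical linear teacher generating the training data.
X = rng.standard_normal((n, d)) / np.sqrt(d)
beta = rng.standard_normal(d)
y = X @ beta + 0.1 * rng.standard_normal(n)

# Two independent random feature matrices.
W1 = rng.standard_normal((d, N1))
W2 = rng.standard_normal((d, N2))

def features(X):
    # Concatenate two types of random features (ReLU and cosine,
    # an assumed choice), as in a double random feature model.
    F1 = np.maximum(X @ W1, 0.0)   # ReLU random features
    F2 = np.cos(X @ W2)            # cosine random features
    return np.hstack([F1, F2]) / np.sqrt(N1 + N2)

# Ridge regression on the concatenated features.
F = features(X)
theta = np.linalg.solve(F.T @ F + lam * np.eye(N1 + N2), F.T @ y)

# Evaluate on fresh test data; sweeping N1 + N2 and plotting this
# test error against model size is how a multiple-descent risk
# curve would be traced out empirically.
X_test = rng.standard_normal((1000, d)) / np.sqrt(d)
y_test = X_test @ beta
test_mse = np.mean((features(X_test) @ theta - y_test) ** 2)
```

Repeating this fit over a grid of feature dimensions (and averaging over draws of the random weights) yields the empirical risk curves that the theory characterizes.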