Paper Title


Online Hyperparameter Search Interleaved with Proximal Parameter Updates

Authors

Luis Miguel Lopez-Ramos, Baltasar Beferull-Lozano

Abstract


There is a clear need for efficient algorithms to tune the hyperparameters of statistical learning schemes, since commonly applied search methods (such as grid search with N-fold cross-validation) are inefficient and/or approximate. Existing algorithms that search for hyperparameters efficiently by relying on the smoothness of the cost function cannot be applied to problems such as Lasso regression. In this contribution, we develop a hyperparameter optimization method that relies on the structure of proximal gradient methods and does not require a smooth cost function. The method is applied to leave-one-out (LOO)-validated Lasso and Group Lasso to yield efficient, data-driven hyperparameter optimization algorithms. Numerical experiments corroborate the convergence of the proposed method to a local optimum of the LOO validation error curve, as well as the efficiency of its approximations.
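
To make the ingredients of the abstract concrete, below is a minimal, illustrative Python sketch: the proximal gradient (ISTA) iteration for Lasso, whose non-smooth term is handled by a soft-thresholding proximal step (the structure the proposed method exploits), together with the grid-search-plus-LOO baseline that the abstract describes as inefficient. This is not the paper's interleaved algorithm; the function names, step-size choice, and synthetic data are assumptions made for illustration.

```python
# Minimal sketch (illustrative, not the paper's algorithm): proximal gradient
# for Lasso, and the grid-search + leave-one-out baseline it aims to improve on.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Solve min_w 0.5 * ||y - X w||^2 + lam * ||w||_1 via proximal gradient (ISTA)."""
    n, d = X.shape
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L, with L the Lipschitz constant of the gradient
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)             # gradient of the smooth (least-squares) part
        w = soft_threshold(w - step * grad, step * lam)  # prox step for the non-smooth part
    return w

def loo_error(X, y, lam):
    """Leave-one-out validation error for a given hyperparameter lam."""
    n = len(y)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        w = lasso_ista(X[mask], y[mask], lam)  # re-solve the Lasso per held-out sample
        errs.append((y[i] - X[i] @ w) ** 2)
    return np.mean(errs)

# Grid search over lam: the inefficient baseline the abstract refers to.
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 10))
w_true = rng.standard_normal(10) * (rng.random(10) < 0.3)  # sparse ground truth
y = X @ w_true + 0.1 * rng.standard_normal(30)
grid = np.logspace(-3, 1, 20)
best_lam = min(grid, key=lambda lam: loo_error(X, y, lam))
print("grid-search LOO-optimal lambda:", best_lam)
```

The baseline re-solves the Lasso from scratch for every grid point and every held-out sample; per the title and abstract, the paper's contribution instead interleaves the hyperparameter update with these proximal parameter updates to converge to a local optimum of the LOO validation error curve.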
