Paper title
StochasticRank: Global Optimization of Scale-Free Discrete Functions
Paper authors
Paper abstract
In this paper, we introduce a powerful and efficient framework for the direct optimization of ranking metrics. The problem is ill-posed due to the discrete structure of the loss, so to deal with it we introduce two key techniques: stochastic smoothing and a novel gradient estimator based on partial integration. We show that classic smoothing approaches may introduce bias and present a universal solution for proper debiasing. Importantly, we can guarantee the global convergence of our method by adopting the recently proposed Stochastic Gradient Langevin Boosting algorithm. Our algorithm is implemented as part of the CatBoost gradient boosting library and outperforms existing approaches on several learning-to-rank datasets. Beyond ranking metrics, our framework applies to any scale-free discrete loss function.
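To make the smoothing idea concrete, below is a standard formulation of stochastic (Gaussian) smoothing together with the integration-by-parts gradient identity it enables; the symbols $z$, $\sigma$, and the Gaussian choice of noise are illustrative assumptions, and the paper's exact (debiased) estimator may differ:

$$
L_\sigma(z) \;=\; \mathbb{E}_{\varepsilon \sim \mathcal{N}(0,\,\sigma^2 I)}\bigl[ L(z + \varepsilon) \bigr],
\qquad
\nabla_z L_\sigma(z) \;=\; \frac{1}{\sigma^2}\,\mathbb{E}_{\varepsilon \sim \mathcal{N}(0,\,\sigma^2 I)}\bigl[ L(z + \varepsilon)\,\varepsilon \bigr].
$$

The gradient identity follows by integrating against the Gaussian density and differentiating under the integral sign, so it holds even though the discrete loss $L$ itself is non-differentiable. It yields a Monte Carlo gradient estimate that requires only evaluations of $L$, which is what makes the ill-posed discrete problem amenable to gradient-based boosting.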