Paper Title
Nearly Optimal Robust Method for Convex Compositional Problems with Heavy-Tailed Noise
Paper Authors
Paper Abstract
In this paper, we propose robust stochastic algorithms for solving convex compositional problems of the form $f(\E_ξ g(\cdot; ξ)) + r(\cdot)$ by establishing {\bf sub-Gaussian confidence bounds} under weak assumptions about the tails of the noise distribution, i.e., {\bf heavy-tailed noise} with bounded second-order moments. One can achieve this goal by using an existing boosting strategy that boosts a low-probability convergence result into a high-probability one. However, piecing together existing results to solve compositional problems suffers from several drawbacks: (i) the boosting technique requires strong convexity of the objective; (ii) it requires a separate algorithm to handle the non-smooth $r$; (iii) it also incurs an additional polylogarithmic factor of the condition number. To address these issues, we directly develop a single-trial stochastic algorithm for minimizing strongly convex compositional objectives that enjoys a nearly optimal high-probability convergence guarantee, matching the lower bound of stochastic strongly convex optimization up to a logarithmic factor. To the best of our knowledge, this is the first work that establishes nearly optimal sub-Gaussian confidence bounds for compositional problems under heavy-tailed noise assumptions.
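To make the problem form concrete, below is a minimal sketch of a generic stochastic compositional gradient method with a proximal step for the non-smooth $r$. This is *not* the robust algorithm proposed in the paper; the particular instances (quadratic outer $f$, noisy linear inner $g$, $\ell_1$ regularizer $r$) and all step-size choices are illustrative assumptions.

```python
import numpy as np

# Sketch of a stochastic compositional objective
#   F(x) = f(E_xi[g(x; xi)]) + r(x)
# with hypothetical instances (not from the paper):
#   g(x; xi) = A_xi @ x           noisy inner map, E[A_xi] = A_mean
#   f(y)     = 0.5 * ||y - b||^2  smooth convex outer function
#   r(x)     = lam * ||x||_1      non-smooth regularizer (handled via prox)

rng = np.random.default_rng(0)
d, m = 5, 3
A_mean = rng.standard_normal((m, d))
b = rng.standard_normal(m)
lam = 0.01

def sample_g(x):
    # one stochastic sample of the inner map: value and Jacobian
    A_xi = A_mean + 0.1 * rng.standard_normal((m, d))
    return A_xi @ x, A_xi

def prox_l1(x, t):
    # proximal operator of t * lam * ||.||_1 (soft-thresholding)
    return np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)

x = np.zeros(d)
y = np.zeros(m)  # running estimate of the inner expectation E_xi[g(x; xi)]
for t in range(1, 2001):
    gx, J = sample_g(x)
    beta = 1.0 / t**0.5                # tracking step size (heuristic)
    y = (1 - beta) * y + beta * gx     # track E_xi[g(x; xi)]
    grad = J.T @ (y - b)               # chain rule: J^T grad f(y)
    eta = 1.0 / (t + 10)               # decaying step; offset tempers early steps
    x = prox_l1(x - eta * grad, eta)   # proximal stochastic update
```

The key structural difficulty the abstract refers to is visible here: the gradient of $f(\E_ξ g(x; ξ))$ cannot be estimated unbiasedly from a single sample, so the inner expectation is tracked by the auxiliary sequence `y`, and the non-smooth $r$ is handled through its proximal operator rather than (sub)gradients.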