Title
Locally-Adaptive Nonparametric Online Learning
Authors
Abstract
One of the main strengths of online algorithms is their ability to adapt to arbitrary data sequences. This is especially important in nonparametric settings, where performance is measured against rich classes of comparator functions that are able to fit complex environments. Although such hard comparators and complex environments may exhibit local regularities, efficient algorithms, which can provably take advantage of these local patterns, are hardly known. We fill this gap by introducing efficient online algorithms (based on a single versatile master algorithm) each adapting to one of the following regularities: (i) local Lipschitzness of the competitor function, (ii) local metric dimension of the instance sequence, (iii) local performance of the predictor across different regions of the instance space. Extending previous approaches, we design algorithms that dynamically grow hierarchical $ε$-nets on the instance space whose prunings correspond to different "locality profiles" for the problem at hand. Using a technique based on tree experts, we simultaneously and efficiently compete against all such prunings, and prove regret bounds each scaling with a quantity associated with a different type of local regularity. When competing against "simple" locality profiles, our technique delivers regret bounds that are significantly better than those proven using the previous approach. On the other hand, the time dependence of our bounds is not worse than that obtained by ignoring any local regularities.
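To make the core construction concrete, the following is a minimal sketch of how hierarchical $ε$-nets can be grown online, in the greedy fashion the abstract alludes to: instances arrive one at a time, and a new center is created at every level of the hierarchy where the instance is not yet covered at that level's radius. The class name, the halving of the radius across levels, and the Euclidean metric are illustrative assumptions, not the paper's exact algorithm (which additionally runs tree-expert aggregation over prunings of this hierarchy).

```python
import math


class HierarchicalEpsNet:
    """Greedy online construction of nested eps-nets at radii
    eps0, eps0/2, eps0/4, ... (illustrative sketch, not the paper's
    full algorithm). The net only grows in regions where instances
    actually arrive, so its size reflects the local metric dimension
    of the instance sequence."""

    def __init__(self, eps0, depth, dist=math.dist):
        self.eps0 = eps0
        self.depth = depth
        self.dist = dist
        # One list of centers per level; level l uses radius eps0 / 2**l.
        self.levels = [[] for _ in range(depth)]

    def route(self, x):
        """Return the path of nearest centers for instance x, creating a
        new center at every level where x is not eps-covered yet."""
        path = []
        for level in range(self.depth):
            eps = self.eps0 / (2 ** level)
            nearest = min(self.levels[level],
                          key=lambda c: self.dist(c, x),
                          default=None)
            if nearest is None or self.dist(nearest, x) > eps:
                # x is uncovered at this radius: promote it to a center.
                self.levels[level].append(x)
                nearest = x
            path.append(nearest)
        return path
```

A pruning of the resulting center tree cuts each root-to-leaf path at some level, selecting one covering radius per region of the instance space; the paper's master algorithm competes with all such prunings simultaneously, which is how it adapts to the "locality profiles" (i)-(iii).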