Paper Title

Self-adjusting Population Sizes for the $(1, λ)$-EA on Monotone Functions

Paper Authors

Marc Kaufmann, Maxime Larcher, Johannes Lengler, Xun Zou

Paper Abstract

We study the $(1,λ)$-EA with mutation rate $c/n$ for $c\le 1$, where the population size is adaptively controlled with the $(1:s+1)$-success rule. Recently, Hevia Fajardo and Sudholt have shown that this setup with $c=1$ is efficient on \onemax for $s<1$, but inefficient if $s \ge 18$. Surprisingly, the hardest part is not close to the optimum, but rather at linear distance. We show that this behavior is not specific to \onemax. If $s$ is small, then the algorithm is efficient on all monotone functions, and if $s$ is large, then it needs superpolynomial time on all monotone functions. In the former case, for $c<1$ we show an $O(n)$ upper bound for the number of generations and $O(n\log n)$ for the number of function evaluations, and for $c=1$ we show $O(n\log n)$ generations and $O(n^2\log\log n)$ evaluations. We also show formally that optimization is always fast, regardless of $s$, if the algorithm starts in proximity of the optimum. All results also hold in a dynamic environment where the fitness function changes in each generation.
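
To make the setup concrete, below is a minimal Python sketch of the self-adjusting $(1,λ)$-EA with the $(1:s+1)$-success rule, run here on OneMax. The update factor `F`, the initial offspring population size λ = 1, the rounding of λ, and the generation cap are illustrative assumptions not taken from the paper; the mutation rate $c/n$, the non-elitist comma selection, and the success rule (shrink λ by $F$ on a strict fitness improvement, grow it by $F^{1/s}$ otherwise) follow the description in the abstract.

```python
import random

def self_adjusting_one_lambda_ea(f, n, s=0.5, c=1.0, F=1.5, max_gens=100_000):
    """Sketch of the self-adjusting (1,lambda)-EA with the (1:s+1)-success rule.

    Assumptions for illustration: the optimum has fitness n (true for OneMax),
    lambda starts at 1, and lambda is rounded to the nearest integer when
    sampling offspring. Returns (generations, function evaluations).
    """
    x = [random.randint(0, 1) for _ in range(n)]   # random initial search point
    fx = f(x)
    lam = 1.0
    evals = 1
    for gen in range(max_gens):
        if fx == n:                                # optimum reached
            return gen, evals
        # Create round(lam) offspring by standard bit mutation with rate c/n.
        best, best_fit = None, None
        for _ in range(max(1, round(lam))):
            y = [bit ^ (random.random() < c / n) for bit in x]
            fy = f(y)
            evals += 1
            if best_fit is None or fy > best_fit:
                best, best_fit = y, fy
        # (1:s+1)-success rule: shrink lambda on success, grow it otherwise.
        if best_fit > fx:
            lam = max(1.0, lam / F)
        else:
            lam *= F ** (1 / s)
        # Comma selection is non-elitist: the best offspring always replaces x.
        x, fx = best, best_fit
    return max_gens, evals

# Example run on OneMax (the number of one-bits) with a small success ratio s.
gens, evals = self_adjusting_one_lambda_ea(sum, n=100, s=0.5)
print(gens, evals)
```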
