Paper Title

First Steps Towards a Runtime Analysis When Starting With a Good Solution

Paper Authors

Denis Antipov, Maxim Buzdalov, Benjamin Doerr

Paper Abstract

The mathematical runtime analysis of evolutionary algorithms traditionally regards the time an algorithm needs to find a solution of a certain quality when initialized with a random population. In practical applications it may be possible to guess solutions that are better than random ones. We start a mathematical runtime analysis for such situations. We observe that different algorithms profit to a very different degree from a better initialization. We also show that the optimal parameterization of the algorithm can depend strongly on the quality of the initial solutions. To overcome this difficulty, self-adjusting and randomized heavy-tailed parameter choices can be profitable. Finally, we observe a larger gap between the performance of the best evolutionary algorithm we found and the corresponding black-box complexity. This could suggest that evolutionary algorithms better exploiting good initial solutions are still to be found. These first findings stem from analyzing the performance of the $(1+1)$ evolutionary algorithm and the static, self-adjusting, and heavy-tailed $(1 + (\lambda,\lambda))$ GA on the OneMax benchmark. We are optimistic that the question of how to profit from good initial solutions is interesting beyond these first examples.
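
As an illustration of the setting studied in the paper, below is a minimal Python sketch (not the authors' code) of the $(1+1)$ evolutionary algorithm with standard bit mutation on the OneMax benchmark, started from an initial solution of a prescribed quality instead of a uniformly random one. All function names and parameter choices here are illustrative assumptions.

```python
import random

def onemax(x):
    """OneMax fitness: the number of one-bits in the bit string."""
    return sum(x)

def one_plus_one_ea(n, initial_ones, seed=0):
    """Sketch of the (1+1) EA on OneMax, started from a bit string with
    `initial_ones` one-bits; returns the number of iterations until the
    optimum (the all-ones string) is reached."""
    rng = random.Random(seed)
    # Construct an initial solution of the requested quality.
    x = [1] * initial_ones + [0] * (n - initial_ones)
    rng.shuffle(x)
    fx = onemax(x)
    iterations = 0
    while fx < n:
        iterations += 1
        # Standard bit mutation: flip each bit independently with prob. 1/n.
        y = [b ^ 1 if rng.random() < 1 / n else b for b in x]
        fy = onemax(y)
        if fy >= fx:  # elitist selection: keep the offspring if not worse
            x, fx = y, fy
    return iterations

if __name__ == "__main__":
    n = 100
    # Compare a random-quality start (about n/2 one-bits) with a good start.
    print("random-like start:", one_plus_one_ea(n, n // 2))
    print("good start (90% ones):", one_plus_one_ea(n, 9 * n // 10))
```

Running the sketch shows, in a purely informal way, the effect the paper analyzes rigorously: the closer the initial solution is to the optimum, the fewer iterations the $(1+1)$ EA needs, and the more the choice of algorithm and its parameters matters.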
