Paper Title

Additive Tree-Structured Conditional Parameter Spaces in Bayesian Optimization: A Novel Covariance Function and a Fast Implementation

Paper Authors

Xingchen Ma, Matthew B. Blaschko

Abstract

Bayesian optimization (BO) is a sample-efficient global optimization algorithm for black-box functions that are expensive to evaluate. Existing literature on model-based optimization in conditional parameter spaces is usually built on trees. In this work, we generalize the additive assumption to tree-structured functions and propose an additive tree-structured covariance function, showing improved sample efficiency, wider applicability, and greater flexibility. Furthermore, by incorporating the structural information of parameter spaces and the additive assumption into the BO loop, we develop a parallel algorithm to optimize the acquisition function, and this optimization can be performed in a low-dimensional space. We demonstrate our method on an optimization benchmark function, on a neural network compression problem, on pruning pre-trained VGG16 and ResNet50 models, as well as on searching activation functions of ResNet20. Experimental results show our approach significantly outperforms the current state of the art for conditional parameter optimization, including SMAC, TPE, and Jenatton et al. (2017).
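To make the core idea concrete, the following is a minimal sketch (our own illustration, not the paper's implementation; all names and the per-dimension kernel choice are assumptions) of an additive covariance over a tree-structured conditional space: each configuration carries only the parameters on its active tree path, and the covariance is a sum of one-dimensional kernels over the parameters shared by both configurations.

```python
import math

def rbf_1d(a, b, length_scale=1.0):
    """Squared-exponential kernel on a single continuous parameter."""
    return math.exp(-0.5 * ((a - b) / length_scale) ** 2)

def additive_tree_kernel(x, y, length_scale=1.0):
    """Sum of 1-D kernels over parameters active in both configurations.

    x and y are dicts mapping parameter names to values; only parameters
    on the active tree path are present. Configurations on disjoint paths
    share no parameters and get covariance 0; overlapping paths contribute
    one additive term per shared parameter.
    """
    shared = set(x) & set(y)
    return sum(rbf_1d(x[p], y[p], length_scale) for p in shared)

# Two configurations sharing the root parameter "r" but on different branches:
x = {"r": 0.5, "left.a": 1.0}
y = {"r": 0.5, "right.b": 2.0}
print(additive_tree_kernel(x, y))  # only "r" is shared -> 1.0
```

Because each additive term depends on one parameter at a time, the acquisition function decomposes over tree paths, which is what allows the paper's low-dimensional, parallel acquisition optimization.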
