Paper Title
Towards Automatic Bayesian Optimization: A first step involving acquisition functions
Paper Authors
Paper Abstract
Bayesian optimization is the state-of-the-art technique for optimizing black-box functions, i.e., functions whose analytical expression and gradients are unavailable, that are expensive to evaluate, and whose evaluations are noisy. The most popular application of Bayesian optimization is the automatic hyperparameter tuning of machine learning algorithms, where we obtain the best configuration of a machine learning algorithm by optimizing an estimate of its generalization error. Despite being applied with success, Bayesian optimization methodologies also have hyperparameters that need to be configured, such as the probabilistic surrogate model or the acquisition function used. A poor choice for these hyperparameters leads to poor-quality results. Typically, these hyperparameters are tuned by making assumptions about the objective function to be optimized, but there are scenarios where no prior information about the objective function is available. In this paper, we present a first attempt at automatic Bayesian optimization by exploring several heuristics that automatically tune the acquisition function of Bayesian optimization. We illustrate the effectiveness of these heuristics on a set of benchmark problems and on a hyperparameter tuning problem of a machine learning algorithm.
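To make the role of the acquisition function concrete, here is a minimal sketch of one standard choice, Expected Improvement (EI), which scores each candidate point by how much it is expected to improve on the best observation so far, given the surrogate model's posterior mean and standard deviation. This is a generic illustration, not the paper's method; the candidate values below are made up for the example.

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected Improvement (for minimization) at one candidate point,
    given the surrogate's posterior mean `mu` and std `sigma` there."""
    if sigma == 0.0:
        return 0.0
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # standard normal cdf
    return (f_best - mu) * cdf + sigma * pdf

# Hypothetical candidates as (x, posterior mean, posterior std) --
# illustrative numbers only, not taken from the paper.
candidates = [
    (0.2, 1.5, 0.1),
    (0.5, 1.2, 0.4),
    (0.8, 1.4, 0.9),
]
f_best = 1.3  # best objective value observed so far

# The next point to evaluate is the candidate maximizing the acquisition.
next_x = max(candidates, key=lambda c: expected_improvement(c[1], c[2], f_best))[0]
```

Note that the highest-EI candidate here is the one with the largest posterior uncertainty rather than the lowest posterior mean; this exploration/exploitation trade-off is exactly what differs across acquisition functions, motivating the heuristics for choosing among them automatically.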