Paper Title
A Modified Bayesian Optimization based Hyper-Parameter Tuning Approach for Extreme Gradient Boosting
Paper Authors
Paper Abstract
It has already been reported in the literature that the performance of a machine learning algorithm is greatly impacted by proper hyper-parameter optimization. One way to perform hyper-parameter optimization is manual search, but that is time consuming. Common approaches for hyper-parameter optimization include Grid search, Random search, and Bayesian optimization using Hyperopt. In this paper, we propose a new approach for hyper-parameter optimization, namely Randomized-Hyperopt, and then tune the hyper-parameters of XGBoost, i.e. the Extreme Gradient Boosting algorithm, on ten datasets by applying Random search, Randomized-Hyperopt, Hyperopt, and Grid search. The performance of each of these four techniques was compared by taking both prediction accuracy and execution time into consideration. We find that Randomized-Hyperopt performs better than the other three conventional methods for hyper-parameter optimization of XGBoost.
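To make the comparison concrete, the random-search baseline mentioned in the abstract can be sketched as follows. This is a minimal, library-free illustration, not the paper's implementation: the search space is a hypothetical subset of common XGBoost hyper-parameters, and `toy_objective` is a synthetic stand-in for the cross-validated prediction accuracy that the paper actually optimizes.

```python
import random

# Hypothetical search space mirroring a few common XGBoost hyper-parameters
# (an assumption for illustration; the paper's exact space is not given here).
SPACE = {
    "max_depth": list(range(3, 11)),
    "learning_rate": [0.01, 0.05, 0.1, 0.2, 0.3],
    "n_estimators": [50, 100, 200, 400],
}

def sample(space, rng):
    """Draw one random configuration from the space."""
    return {name: rng.choice(values) for name, values in space.items()}

def random_search(objective, space, n_trials=20, seed=0):
    """Evaluate n_trials random configurations; return the best one found."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = sample(space, rng)
        score = objective(cfg)  # in practice: cross-validated accuracy of XGBoost
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Synthetic objective: peaks at max_depth=6 and learning_rate=0.1
# (purely illustrative, not derived from the paper's datasets).
def toy_objective(cfg):
    return -abs(cfg["max_depth"] - 6) - abs(cfg["learning_rate"] - 0.1) * 10

best, score = random_search(toy_objective, SPACE, n_trials=50)
```

Grid search would instead enumerate the full Cartesian product of `SPACE` (160 configurations here), while Hyperopt-style Bayesian optimization would use the scores of past trials to bias where the next configuration is drawn; the paper's Randomized-Hyperopt hybridizes randomization with Hyperopt's search.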