Paper Title

Faster SVM Training via Conjugate SMO

Paper Authors

Alberto Torres-Barrán, Carlos Alaíz, José R. Dorronsoro

Paper Abstract

We propose an improved version of the SMO algorithm for training classification and regression SVMs, based on a Conjugate Descent procedure. This new approach involves only a modest increase in the computational cost of each iteration but, in turn, usually results in a substantial decrease in the number of iterations required to converge to a given precision. In addition, we prove convergence of the iterates of this new Conjugate SMO, as well as a linear convergence rate when the kernel matrix is positive definite. We have implemented Conjugate SMO within the LIBSVM library and show experimentally that it is faster for many hyper-parameter configurations, often making it a better option than second-order SMO when performing a grid search for SVM tuning.
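To make the conjugate-descent idea in the abstract concrete, below is a minimal Python sketch of a conjugate-SMO-style update on the SVM dual. It is an illustration under simplifying assumptions, not the paper's exact algorithm: working-pair selection uses the classic first-order maximal-violating-pair rule rather than second-order SMO, the momentum restart policy on box clipping is a guessed simplification, and conjugate_smo_sketch with its parameters is a hypothetical name introduced here.

    import numpy as np

    def conjugate_smo_sketch(K, y, C=1.0, tol=1e-5, max_iter=10000):
        # Illustrative conjugate-SMO-style solver for the SVM dual
        #   min_a 0.5 * a^T Q a - 1^T a  s.t.  y^T a = 0, 0 <= a_t <= C,
        # with Q = (y y^T) * K. A sketch, not the paper's exact method:
        # pair selection is the maximal-violating-pair rule, and the
        # conjugation/restart policy is a simplified stand-in.
        n = len(y)
        Q = (y[:, None] * y[None, :]) * K
        a = np.zeros(n)
        g = -np.ones(n)        # gradient of the dual objective at a = 0
        p_prev = np.zeros(n)   # previous search direction (momentum term)

        for _ in range(max_iter):
            # Working-set selection: maximal violating pair (i, j).
            up = ((y > 0) & (a < C)) | ((y < 0) & (a > 0))
            low = ((y < 0) & (a < C)) | ((y > 0) & (a > 0))
            viol = -y * g
            i = np.where(up)[0][np.argmax(viol[up])]
            j = np.where(low)[0][np.argmin(viol[low])]
            if viol[i] - viol[j] < tol:   # KKT gap below tolerance: done
                break

            # Plain SMO direction for the pair; note that y @ d == 0.
            d = np.zeros(n)
            d[i], d[j] = y[i], -y[j]

            # Conjugate the new direction against the previous one w.r.t.
            # Q, so that p @ Q @ p_prev == 0 (the conjugate-descent idea).
            p = d.copy()
            qp_prev = Q @ p_prev
            denom = p_prev @ qp_prev
            if denom > 1e-12:
                p -= ((qp_prev @ d) / denom) * p_prev
            if g @ p >= 0:                # safeguard: keep a descent direction
                p = d

            # Exact line search along p, clipped to the box [0, C]^n.
            qp = Q @ p
            lam = -(g @ p) / max(p @ qp, 1e-12)
            lam_max = np.inf
            pos, neg = p > 1e-12, p < -1e-12
            if pos.any():
                lam_max = min(lam_max, ((C - a)[pos] / p[pos]).min())
            if neg.any():
                lam_max = min(lam_max, (-a[neg] / p[neg]).min())
            clipped = lam > lam_max
            lam = min(lam, lam_max)

            a += lam * p
            g += lam * qp                 # keep the dual gradient up to date
            p_prev = np.zeros(n) if clipped else p  # restart momentum if clipped
        return a

    # Toy usage: two linearly separable 2-D blobs with a linear kernel.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2, 0.6, (20, 2)), rng.normal(2, 0.6, (20, 2))])
    y = np.array([-1.0] * 20 + [1.0] * 20)
    alpha = conjugate_smo_sketch(X @ X.T, y, C=1.0)
    print("support vectors:", int((alpha > 1e-8).sum()))

The conjugation step against the previous direction is where the abstract's "modest increase in the computational cost of each iteration" shows up in this sketch: one extra kernel-matrix product per iteration buys a search direction that does not undo progress made along the previous one, which is what can reduce the total iteration count.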
