Paper Title

QUBO Formulations for Training Machine Learning Models

Paper Authors

Prasanna Date, Davis Arthur, Lauren Pusey-Nazzaro

Abstract

Training machine learning models on classical computers is usually a time- and compute-intensive process. With Moore's law coming to an end and ever-increasing demand for large-scale data analysis using machine learning, we must leverage non-conventional computing paradigms like quantum computing to train machine learning models efficiently. Adiabatic quantum computers like the D-Wave 2000Q can approximately solve NP-hard optimization problems, such as quadratic unconstrained binary optimization (QUBO), faster than classical computers. Since many machine learning problems are also NP-hard, we believe adiabatic quantum computers might be instrumental in training machine learning models efficiently in the post-Moore's-law era. In order to solve a problem on an adiabatic quantum computer, it must be formulated as a QUBO problem, which is a challenging task in itself. In this paper, we formulate the training problems of three machine learning models---linear regression, support vector machine (SVM) and equal-sized k-means clustering---as QUBO problems so that they can be trained on adiabatic quantum computers efficiently. We also analyze the time and space complexities of our formulations and compare them to the state-of-the-art classical algorithms for training these machine learning models. We show that the time and space complexities of our formulations are better than (in the case of SVM and equal-sized k-means clustering) or equivalent to (in the case of linear regression) those of their classical counterparts.
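To make the QUBO form concrete: a QUBO instance asks for a binary vector x in {0,1}^n minimizing the quadratic objective x^T Q x for an upper-triangular matrix Q. The sketch below is a minimal illustration of this objective (not the paper's linear-regression, SVM, or k-means formulations), using a tiny hypothetical Q and an exponential-time brute-force search; an adiabatic annealer such as the D-Wave 2000Q instead samples low-energy solutions approximately.

```python
import itertools
import numpy as np

def solve_qubo_brute_force(Q):
    """Exhaustively minimize the QUBO objective x^T Q x over x in {0,1}^n.

    Brute force enumerates all 2^n binary vectors, so it only works for
    tiny n; it serves here purely to define what a QUBO solution is.
    """
    n = Q.shape[0]
    best_x, best_energy = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        energy = x @ Q @ x
        if energy < best_energy:
            best_x, best_energy = x, energy
    return best_x, best_energy

# Toy 2-variable instance (hypothetical coefficients, illustration only):
# negative diagonal rewards setting a bit, positive off-diagonal
# penalizes setting both bits together.
Q = np.array([[-1.0, 2.0],
              [0.0, -1.0]])
x, energy = solve_qubo_brute_force(Q)
# The minimum sets exactly one bit, with objective value -1.
```

Training a model on an annealer then amounts to encoding its loss function into such a Q matrix, which is the contribution of the paper's three formulations.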
