Paper Title
Chaos and Complexity from Quantum Neural Network: A study with Diffusion Metric in Machine Learning
Paper Authors
Paper Abstract
In this work, our prime objective is to study the phenomena of quantum chaos and complexity in the machine learning dynamics of a Quantum Neural Network (QNN). A Parameterized Quantum Circuit (PQC) in the hybrid quantum-classical framework is introduced as a universal function approximator to perform optimization with Stochastic Gradient Descent (SGD). We employ a statistical and differential geometric approach to study the learning theory of the QNN. The evolution of the parameterized unitary operators is correlated with the trajectory of the parameters in the Diffusion metric. We establish parameterized versions of Quantum Complexity and Quantum Chaos in terms of physically relevant quantities, which are essential not only in determining the stability, but also in providing a significant lower bound on the generalization capability of the QNN. We explicitly prove that the generalization capability of the QNN is maximized when the system executes limit cycles or oscillations in phase space. Finally, we determine the bound on the generalization capability in terms of the variance of the parameters of the QNN in the steady-state condition using the Cauchy-Schwarz inequality.
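As a concrete illustration of the hybrid quantum-classical setup the abstract refers to, the following is a minimal sketch, not the authors' code: a single-qubit parameterized quantum circuit whose rotation angles are optimized with plain SGD using the standard parameter-shift rule. The gate layout, the Pauli-Z cost function, and the learning rate are illustrative assumptions; the trajectory of `theta` produced by such a loop is the kind of parameter trajectory the paper studies through the Diffusion metric.

```python
# Minimal sketch (illustrative, not from the paper): a single-qubit PQC
# U(theta) = RZ(theta2) RY(theta1) RZ(theta0) trained with SGD.
import numpy as np

def rz(t):
    # Z-rotation gate exp(-i t Z / 2)
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]])

def ry(t):
    # Y-rotation gate exp(-i t Y / 2)
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]])

def circuit(theta):
    """Apply the parameterized unitary to |0> and return the output state."""
    state = np.array([1.0 + 0j, 0.0 + 0j])
    return rz(theta[2]) @ ry(theta[1]) @ rz(theta[0]) @ state

def cost(theta):
    """Expectation value of Pauli-Z; its minimum of -1 is reached at |1>."""
    z = np.array([[1, 0], [0, -1]])
    psi = circuit(theta)
    return np.real(np.conj(psi) @ z @ psi)

def grad(theta):
    """Exact gradient via the parameter-shift rule for rotation gates."""
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += np.pi / 2
        minus[i] -= np.pi / 2
        g[i] = 0.5 * (cost(plus) - cost(minus))
    return g

# SGD loop: the sequence of theta values is the parameter trajectory
# whose geometry (diffusion, chaos, complexity) the paper analyses.
theta = np.random.default_rng(0).uniform(0, 2 * np.pi, size=3)
for step in range(100):
    theta -= 0.1 * grad(theta)
print("final cost:", cost(theta))  # approaches -1
```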