Paper title
A universal training scheme and the resulting universality for machine learning phases
Paper authors
Paper abstract
An autoencoder (AE) and a generative adversarial network (GAN) are trained only once on a one-dimensional (1D) lattice of 200 sites. Moreover, the AE contains only one hidden layer consisting of two neurons, and both the generator and the discriminator of the GAN are made up of two neurons as well. The training set employed to train both of the considered unsupervised neural networks (NNs) is composed of two artificial configurations. Remarkably, despite their simple architectures, both the constructed AE and GAN precisely determine the critical points of several models, including the three-dimensional (3D) classical $O(3)$ model, the two-dimensional (2D) generalized classical XY model, the 2D two-state Potts model, and the 1D Bose-Hubbard model. The results presented here, as well as those shown in {\it Eur. Phys. J. Plus {\bf 136}, 1116 (2021)}, suggest that when phase transitions are considered, an elegant universal neural network that is extremely efficient and applicable to a broad range of physical systems can be constructed with ease. In particular, since a NN trained with only two configurations can be applied to many models, it is likely that, as far as machine learning is concerned, the majority of phase transitions belong to a class having two elements, i.e. the Ising class.
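To make the described minimal architecture concrete, the following is a sketch of an AE with a 200-site input and a single two-neuron hidden layer, trained on a set of exactly two artificial configurations. The abstract specifies only the lattice size, the two-neuron bottleneck, and the two-configuration training set; the choice of all-ones/all-zeros configurations, the sigmoid activations, and the training hyperparameters below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_tiny_autoencoder(n_sites=200, n_hidden=2, epochs=5000, lr=1.0, seed=0):
    """Train a one-hidden-layer AE on two artificial configurations.

    The training set (an all-ones and an all-zeros configuration), the
    sigmoid activations, and the hyperparameters are assumptions made for
    illustration only.
    """
    rng = np.random.default_rng(seed)
    # The two artificial training configurations on a 200-site 1D lattice.
    X = np.vstack([np.ones(n_sites), np.zeros(n_sites)])

    # Small random initial weights for the encoder (W1) and decoder (W2).
    W1 = rng.normal(0.0, 0.1, (n_sites, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.1, (n_hidden, n_sites))
    b2 = np.zeros(n_sites)

    losses = []
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)        # 2 x n_hidden bottleneck activations
        out = sigmoid(h @ W2 + b2)      # reconstruction of the inputs
        losses.append(np.mean((out - X) ** 2))

        # Manual backpropagation of the mean-squared reconstruction error.
        d_out = (out - X) * out * (1.0 - out) / X.shape[0]
        dW2 = h.T @ d_out
        db2 = d_out.sum(axis=0)
        d_h = (d_out @ W2.T) * h * (1.0 - h)
        dW1 = X.T @ d_h
        db1 = d_h.sum(axis=0)

        W2 -= lr * dW2
        b2 -= lr * db2
        W1 -= lr * dW1
        b1 -= lr * db1
    return losses

losses = train_tiny_autoencoder()
print(f"reconstruction loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Despite the two-neuron bottleneck, the network easily learns to reconstruct a two-element training set, which is consistent with the abstract's point that an extremely small NN suffices once the training set is reduced to two configurations.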