Paper Title

Accelerated Large Batch Optimization of BERT Pretraining in 54 minutes

Paper Authors

Shuai Zheng, Haibin Lin, Sheng Zha, Mu Li

Abstract


BERT has recently attracted a lot of attention in natural language understanding (NLU) and achieved state-of-the-art results in various NLU tasks. However, its success requires large deep neural networks and huge amounts of data, which result in long training times and impede development progress. Using stochastic gradient methods with large mini-batches has been advocated as an efficient tool to reduce the training time. Along this line of research, LAMB is a prominent example that reduces the training time of BERT from 3 days to 76 minutes on a TPUv3 Pod. In this paper, we propose an accelerated gradient method called LANS to improve the efficiency of using large mini-batches for training. As the learning rate is theoretically upper bounded by the inverse of the Lipschitz constant of the function, one cannot always reduce the number of optimization iterations by selecting a larger learning rate. In order to use larger mini-batch sizes without accuracy loss, we develop a new learning rate scheduler that overcomes the difficulty of using large learning rates. Using the proposed LANS method and the learning rate scheme, we scaled up the mini-batch sizes to 96K and 33K in phases 1 and 2 of BERT pretraining, respectively. It takes 54 minutes on 192 AWS EC2 P3dn.24xlarge instances to achieve a target F1 score of 90.5 or higher on SQuAD v1.1, achieving the fastest BERT training time in the cloud.
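For readers who want a concrete picture of the large-batch baseline the abstract refers to, below is a minimal NumPy sketch of a LAMB-style layer-wise adaptive update (Adam-style moments scaled by a per-layer trust ratio). It illustrates the prior LAMB technique only, not the paper's LANS method or its learning rate scheduler; the function name, hyperparameter values, and toy usage are assumptions made for this example.

```python
# Illustrative sketch of a LAMB-style layer-wise adaptive update in NumPy.
# This is NOT the paper's LANS method; it shows the large-batch baseline
# (Adam-style moments plus a per-layer trust ratio) that LANS builds on.
# All names and hyperparameter values are assumptions for illustration.
import numpy as np

def lamb_step(param, grad, m, v, step, lr=1e-3,
              beta1=0.9, beta2=0.999, eps=1e-6, weight_decay=0.01):
    """One LAMB-style update for a single parameter tensor (one layer)."""
    # Adam-style first and second moment estimates with bias correction.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**step)
    v_hat = v / (1 - beta2**step)

    # Adam direction plus decoupled weight decay.
    update = m_hat / (np.sqrt(v_hat) + eps) + weight_decay * param

    # Layer-wise trust ratio: scale the step by ||w|| / ||update||
    # so every layer moves a comparable relative distance.
    w_norm = np.linalg.norm(param)
    u_norm = np.linalg.norm(update)
    trust_ratio = w_norm / u_norm if w_norm > 0 and u_norm > 0 else 1.0

    param = param - lr * trust_ratio * update
    return param, m, v

# Minimal usage example on a toy "layer".
w = np.random.randn(4, 4)
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 4):
    g = np.random.randn(4, 4)   # stand-in for a mini-batch gradient
    w, m, v = lamb_step(w, g, m, v, t)
```

The trust ratio keeps each layer's effective step proportional to its weight norm, which is the property LARS/LAMB-style methods rely on to remain stable at very large mini-batch sizes such as those quoted in the abstract.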
