Paper Title

*-CFQ: Analyzing the Scalability of Machine Learning on a Compositional Task

Paper Authors

Dmitry Tsarkov, Tibor Tihon, Nathan Scales, Nikola Momchev, Danila Sinopalnikov, Nathanael Schärli

Paper Abstract

We present *-CFQ ("star-CFQ"): a suite of large-scale datasets of varying scope based on the CFQ semantic parsing benchmark, designed for principled investigation of the scalability of machine learning systems in a realistic compositional task setting. Using this suite, we conduct a series of experiments investigating the ability of Transformers to benefit from increased training size under conditions of fixed computational cost. We show that compositional generalization remains a challenge at all training sizes, and we show that increasing the scope of natural language leads to consistently higher error rates, which are only partially offset by increased training data. We further show that while additional training data from a related domain improves the accuracy in data-starved situations, this improvement is limited and diminishes as the distance from the related domain to the target domain increases.

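To make the underlying task concrete, here is a minimal sketch of inspecting one question/SPARQL pair from the original CFQ benchmark on which *-CFQ is based. It assumes the publicly available `cfq` dataset in TensorFlow Datasets with its `mcd1` compositional split; the *-CFQ suite itself is distributed separately and is not shown here.

```python
# Minimal sketch, assuming the public "cfq" dataset in TensorFlow Datasets.
# The *-CFQ suite described in the paper is released separately.
import tensorflow_datasets as tfds

# "mcd1" is one of the standard compositional (MCD) splits of CFQ.
dataset = tfds.load("cfq/mcd1", split="train")

# Each example pairs a natural language question with its SPARQL query.
for example in dataset.take(1):
    question = example["question"].numpy().decode("utf-8")
    query = example["query"].numpy().decode("utf-8")
    print("question:", question)
    print("query:", query)
```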