Paper Title
Jack and Masters of All Trades: One-Pass Learning Sets of Model Sets From Large Pre-Trained Models
Paper Authors
Paper Abstract
For deep learning, size is power. Massive neural nets trained on broad data for a spectrum of tasks are at the forefront of artificial intelligence. These large pre-trained models or Jacks of All Trades (JATs), when fine-tuned for downstream tasks, are gaining importance in driving deep learning advancements. However, environments with tight resource constraints, changing objectives and intentions, or varied task requirements could limit the real-world utility of a singular JAT. Hence, in tandem with current trends towards building increasingly large JATs, this paper conducts an initial exploration into concepts underlying the creation of a diverse set of compact machine learning model sets. Composed of many smaller and specialized models, the Set of Sets is formulated to simultaneously fulfil many task settings and environmental conditions. A means to arrive at such a set tractably in one pass of a neuroevolutionary multitasking algorithm is presented for the first time, bringing us closer to models that are collectively Masters of All Trades.
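To make the one-pass idea concrete, the following is a minimal sketch of neuroevolutionary multitasking over a grid of (task, size-budget) cells, yielding one compact specialist per cell from a single evolutionary run. The tasks, the tiny-MLP genome, and the cross-task transfer rule here are illustrative assumptions for the sketch, not the paper's actual algorithm: the point is only that subpopulations for different cells can evolve simultaneously while occasionally exchanging genetic material.

```python
# Illustrative sketch: one-pass neuroevolutionary multitasking producing a
# "Set of Sets". One subpopulation per (task, budget) cell; occasional
# same-budget cross-task crossover acts as implicit knowledge transfer.
# TASKS, BUDGETS, and the genome layout are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tasks: different 1-D target functions to regress.
TASKS = {"sin": np.sin, "abs": np.abs}
# Hypothetical resource budgets: hidden-layer width of a tiny MLP.
BUDGETS = [4, 8]

X = np.linspace(-3, 3, 64)

def n_params(hidden):
    # w1 (hidden) + b1 (hidden) + w2 (hidden) + b2 (1)
    return 3 * hidden + 1

def mlp_forward(params, x, hidden):
    # Decode a flat genome into a 1-hidden-layer MLP and run it on x.
    w1 = params[:hidden].reshape(hidden, 1)
    b1 = params[hidden:2 * hidden]
    w2 = params[2 * hidden:3 * hidden]
    b2 = params[3 * hidden]
    h = np.tanh(x[:, None] @ w1.T + b1)
    return h @ w2 + b2

def fitness(params, task_fn, hidden):
    # Negative MSE on the task's target function; higher is better.
    pred = mlp_forward(params, X, hidden)
    return -np.mean((pred - task_fn(X)) ** 2)

POP, GENS, TRANSFER_P = 20, 200, 0.1
pops = {(t, b): rng.normal(0, 1, (POP, n_params(b)))
        for t in TASKS for b in BUDGETS}

# Single pass: every cell of the Set of Sets evolves in the same loop.
for gen in range(GENS):
    for (t, b), pop in pops.items():
        fits = np.array([fitness(ind, TASKS[t], b) for ind in pop])
        elite = pop[np.argsort(fits)[::-1][:POP // 2]]
        children = elite + 0.1 * rng.normal(size=elite.shape)
        # Implicit transfer: occasionally seed a child from a same-budget
        # subpopulation evolved on a different task.
        if rng.random() < TRANSFER_P:
            donor_task = rng.choice([u for u in TASKS if u != t])
            donor = pops[(donor_task, b)]
            children[0] = donor[rng.integers(len(donor))]
        pops[(t, b)] = np.vstack([elite, children])

# Report the best compact specialist found for each (task, budget) cell.
for (t, b), pop in pops.items():
    best = max(pop, key=lambda p: fitness(p, TASKS[t], b))
    print(f"task={t:>3} budget={b}: mse={-fitness(best, TASKS[t], b):.4f}")
```

The design choice worth noting is that transfer only happens between same-budget cells, so each specialist stays within its resource constraint while still benefiting from search progress on related tasks; how transfer is scheduled and gated is exactly where the paper's actual algorithm would differ from this toy loop.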