Paper Title

Simultaneous Perturbation Stochastic Approximation for Few-Shot Learning

Paper Authors

Andrei Boiarov, Oleg Granichin, Olga Granichina

Paper Abstract


Few-shot learning is an important research area of machine learning in which a classifier must be trained so that it can adapt to new classes that are not included in the training set, while only a small number of examples of each class are available for training. This is one of the key problems with learning algorithms of this type, and it leads to significant uncertainty. We attack this problem via randomized stochastic approximation. In this paper, we suggest considering a new multi-task loss function and propose an SPSA-like few-shot learning approach based on the prototypical networks method. We provide a theoretical justification and an experimental analysis of this approach. The results of experiments on a benchmark dataset demonstrate that the proposed method is superior to the original prototypical networks.
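As background for the SPSA-like optimization the abstract refers to, the sketch below shows classic SPSA (simultaneous perturbation stochastic approximation) on a toy quadratic loss. The gain schedule, function names, and toy objective are illustrative assumptions, not the paper's actual few-shot training procedure; the key property shown is that SPSA estimates a gradient from only two loss evaluations per step, regardless of the parameter dimension.

```python
import random

def spsa_minimize(f, theta, steps=2000, a=0.1, c=0.1, alpha=0.602, gamma=0.101):
    """Classic SPSA: estimate the gradient with two evaluations of f per
    step via a random simultaneous perturbation of all coordinates."""
    theta = list(theta)
    n = len(theta)
    for k in range(1, steps + 1):
        ak = a / k ** alpha   # decaying step size
        ck = c / k ** gamma   # decaying perturbation magnitude
        # Rademacher (+/-1) perturbation direction for every coordinate at once
        delta = [random.choice((-1.0, 1.0)) for _ in range(n)]
        plus = [t + ck * d for t, d in zip(theta, delta)]
        minus = [t - ck * d for t, d in zip(theta, delta)]
        # Two-sided finite difference along the random direction
        diff = (f(plus) - f(minus)) / (2.0 * ck)
        # Per-coordinate gradient estimate: diff / delta_i
        theta = [t - ak * diff / d for t, d in zip(theta, delta)]
    return theta

# Toy quadratic loss standing in for the few-shot objective (assumption).
target = [1.0, -2.0, 0.5]
loss = lambda th: sum((t - g) ** 2 for t, g in zip(th, target))

random.seed(0)
est = spsa_minimize(loss, [0.0, 0.0, 0.0])
print([round(t, 2) for t in est])
```

Because each iteration needs only two loss evaluations, SPSA remains cheap in high dimensions, which is what makes it attractive for the noisy, small-sample objectives that arise in few-shot learning.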
