Paper Title

Learning Functors using Gradient Descent

Authors

Gavranović, Bruno

Abstract

Neural networks are a general framework for differentiable optimization which includes many other machine learning approaches as special cases. In this paper we build a category-theoretic formalism around a neural network system called CycleGAN. CycleGAN is a general approach to unpaired image-to-image translation that has been attracting attention in recent years. Inspired by categorical database systems, we show that CycleGAN is a "schema", i.e. a specific category presented by generators and relations, whose specific parameter instantiations are just set-valued functors on this schema. We show that enforcing cycle-consistencies amounts to enforcing composition invariants in this category. We generalize the learning procedure to arbitrary such categories and show that a special class of functors, rather than functions, can be learned using gradient descent. Using this framework we design a novel neural network system capable of learning to insert and delete objects from images without paired data. We qualitatively evaluate the system on the CelebA dataset and obtain promising results.
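The abstract's central idea, that cycle-consistency is a composition invariant on a schema category with generators f: A → B and g: B → A and relations g∘f = id and f∘g = id, can be illustrated with a minimal toy sketch. This is a hypothetical one-dimensional example, not code from the paper: the "functor" instantiates each generator as a scalar linear map, and gradient descent on the cycle-consistency loss pushes the composite g∘f toward the identity.

```python
import numpy as np

# Hypothetical sketch: schema with objects A, B and generators
#   f: A -> B,  g: B -> A,  subject to the relation g . f = id_A.
# A parameter instantiation ("functor") sends f and g to the linear
# maps x -> wf*x and y -> wg*y. Enforcing the composition invariant
# means minimizing || g(f(x)) - x ||^2 by gradient descent.

rng = np.random.default_rng(0)
wf, wg = 0.5, 0.5   # initial parameters of f and g
lr = 0.05           # learning rate

def cycle_loss(wf, wg, x):
    # How far the composite g . f is from the identity on A.
    return np.mean((wg * (wf * x) - x) ** 2)

x = rng.normal(size=100)  # samples from "object" A
losses = []
for _ in range(200):
    # Analytic gradients of the cycle-consistency loss.
    r = wg * wf * x - x
    grad_wf = np.mean(2 * r * wg * x)
    grad_wg = np.mean(2 * r * wf * x)
    wf -= lr * grad_wf
    wg -= lr * grad_wg
    losses.append(cycle_loss(wf, wg, x))

# After training, wf * wg is close to 1, i.e. g . f approximates id_A.
```

In the paper's setting the generators are neural networks rather than scalars, but the optimization target is the same: making the composites of the learned functor's images satisfy the schema's relations.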
