Paper Title
Randomly Weighted, Untrained Neural Tensor Networks Achieve Greater Relational Expressiveness
Paper Authors
Paper Abstract
Neural Tensor Networks (NTNs), which are structured to encode the degree of relationship among pairs of entities, are used in Logic Tensor Networks (LTNs) to facilitate Statistical Relational Learning (SRL) in first-order logic. In this paper, we propose Randomly Weighted Tensor Networks (RWTNs), which incorporate randomly drawn, untrained tensors into an NTN encoder network with a trained decoder network. We show that RWTNs meet or surpass the performance of traditionally trained LTNs for Semantic Image Interpretation (SII) tasks, which have been used as a representative example of how LTNs utilize reasoning over first-order logic to exceed the performance of solely data-driven methods. We demonstrate that RWTNs outperform LTNs for the detection of the relevant part-of relations between objects, and we show that RWTNs can achieve performance similar to that of LTNs for object classification while using fewer parameters for learning. Furthermore, we demonstrate that because the randomized weights do not depend on the data, several decoder networks can share a single NTN, giving RWTNs a unique economy of spatial scale for simultaneous classification tasks.
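The encoder/decoder split described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the standard NTN hidden layer tanh(e1ᵀ W^[1:k] e2 + V[e1; e2] + b), freezes the randomly drawn tensor parameters (W, V, b), and trains only decoder weights (here, vectors u1 and u2). The dimensions and names are illustrative assumptions; two decoders are shown sharing one encoder to reflect the claimed economy of spatial scale.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 4, 3  # entity embedding size and number of tensor slices (illustrative)

# Randomly drawn, untrained encoder parameters: fixed once, never updated.
W = rng.standard_normal((k, d, d))   # bilinear tensor slices
V = rng.standard_normal((k, 2 * d))  # linear map over the concatenated pair
b = rng.standard_normal(k)           # bias per slice

def ntn_encode(e1, e2):
    """Frozen NTN hidden layer: tanh(e1^T W^[1:k] e2 + V [e1; e2] + b)."""
    bilinear = np.einsum('i,kij,j->k', e1, W, e2)  # one scalar per slice
    linear = V @ np.concatenate([e1, e2])
    return np.tanh(bilinear + linear + b)

# Only the decoders are trainable. Because W, V, b do not depend on the
# data, several decoders (e.g., for different relations) can share the
# same encoder output.
u1 = rng.standard_normal(k)  # decoder for one classification task
u2 = rng.standard_normal(k)  # a second decoder sharing the same NTN

def score(u, e1, e2):
    """Degree of relationship between e1 and e2 under decoder u."""
    return float(u @ ntn_encode(e1, e2))

e1, e2 = rng.standard_normal(d), rng.standard_normal(d)
s1, s2 = score(u1, e1, e2), score(u2, e1, e2)
```

In a full training loop, a gradient-based optimizer would update only u1 and u2 while the encoder stays fixed, which is where the parameter savings over a fully trained LTN would come from.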