Paper Title
Learning with Asymmetric Kernels: Least Squares and Feature Interpretation
Paper Authors
Paper Abstract
Asymmetric kernels naturally exist in real life, e.g., in conditional probabilities and directed graphs. However, most existing kernel-based learning methods require kernels to be symmetric, which prevents the use of asymmetric kernels. This paper addresses asymmetric kernel-based learning in the framework of the least squares support vector machine, named AsK-LS, resulting in the first classification method that can directly utilize asymmetric kernels. We show that AsK-LS can learn with asymmetric features, namely source and target features, while the kernel trick remains applicable, i.e., the source and target features exist but are not necessarily known explicitly. Moreover, the computational cost of AsK-LS is as low as that of methods using symmetric kernels. Experimental results on the Corel database, directed graphs, and the UCI database show that, when asymmetric information is crucial, the proposed AsK-LS can learn with asymmetric kernels and performs much better than existing kernel methods that must symmetrize asymmetric kernels before use.
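To make the abstract's setup concrete, the following is a minimal, hypothetical sketch, not the paper's AsK-LS formulation (whose dual system and feature interpretation differ): it plugs an asymmetric kernel matrix directly into a standard LS-SVM-style dual solve. The row-normalized Gaussian kernel, the regularization parameter gamma, and all function names are illustrative assumptions. The sketch mainly illustrates the cost claim in the abstract: training amounts to one linear solve, the same as in the symmetric case.

```python
# Illustrative sketch only: a generic LS-SVM dual solve with an asymmetric
# kernel matrix K (K[i, j] may differ from K[j, i]). This is NOT the exact
# AsK-LS system from the paper; kernel choice and gamma are assumptions.
import numpy as np

def asymmetric_kernel(X_src, X_tgt, bandwidth=1.0):
    """Illustrative asymmetric kernel: row-normalized Gaussian affinities
    (conditional-probability style), so the matrix is asymmetric in general."""
    d2 = ((X_src[:, None, :] - X_tgt[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return K / K.sum(axis=1, keepdims=True)  # row normalization breaks symmetry

def lssvm_fit(K, y, gamma=1.0):
    """Solve the LS-SVM dual linear system with a (possibly asymmetric) K.
    Returns dual coefficients alpha and bias b; one linear solve, so the
    cost matches the symmetric-kernel case."""
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * K           # label-weighted kernel
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)                   # handles non-symmetric A
    return sol[1:], sol[0]                          # alpha, b

def predict(K_test, y, alpha, b):
    """Predicted labels; K_test[i, j] = k(x_test_i, x_train_j) for one view
    of the asymmetric kernel (use the transposed convention for the other)."""
    return np.sign(K_test @ (alpha * y) + b)

# Tiny usage example on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=40))
K = asymmetric_kernel(X, X)
alpha, b = lssvm_fit(K, y, gamma=10.0)
print("training accuracy:", (predict(K, y, alpha, b) == y).mean())
```

In the actual AsK-LS method the asymmetric kernel yields two discriminant views (source and target, via K and its transpose); the sketch above only shows how an asymmetric matrix can enter an LS-SVM-style linear system without symmetrization.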