Paper Title


From Two-Class Linear Discriminant Analysis to Interpretable Multilayer Perceptron Design

Paper Authors

Ruiyuan Lin, Zhiruo Zhou, Suya You, Raghuveer Rao, C.-C. Jay Kuo

Paper Abstract

A closed-form solution exists in two-class linear discriminant analysis (LDA), which discriminates two Gaussian-distributed classes in a multi-dimensional feature space. In this work, we interpret the multilayer perceptron (MLP) as a generalization of a two-class LDA system so that it can handle an input composed of multiple Gaussian modalities belonging to multiple classes. Besides the input layer $l_{in}$ and the output layer $l_{out}$, the MLP of interest consists of two intermediate layers, $l_1$ and $l_2$. We propose a feedforward design with three stages: 1) from $l_{in}$ to $l_1$: half-space partitioning accomplished by multiple parallel LDAs, 2) from $l_1$ to $l_2$: subspace isolation, where one Gaussian modality is represented by one neuron, 3) from $l_2$ to $l_{out}$: class-wise subspace mergence, where each Gaussian modality is connected to its target class. Through this process, we present an automatic MLP design that specifies the network architecture (i.e., the number of layers and the number of neurons at each layer) and all filter weights in a feedforward one-pass fashion. This design can be generalized to an arbitrary distribution by leveraging the Gaussian mixture model (GMM). Experiments are conducted to compare the performance of the traditional backpropagation-based MLP (BP-MLP) and the new feedforward MLP (FF-MLP).
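The closed-form two-class LDA solution that underlies stage 1 can be illustrated with a minimal sketch. Assuming two Gaussian classes with a shared covariance $\Sigma$ and means $\mu_0, \mu_1$, the projection direction is $w = \Sigma^{-1}(\mu_1 - \mu_0)$, and each resulting hyperplane $w^\top x + b = 0$ carves out one half-space partition, as a neuron in $l_1$ would. The toy means, covariance, and sample sizes below are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two Gaussian-distributed classes in a 2-D feature space (toy assumption).
mu0, mu1 = np.array([0.0, 0.0]), np.array([3.0, 3.0])
cov = np.array([[1.0, 0.3], [0.3, 1.0]])  # shared covariance
X0 = rng.multivariate_normal(mu0, cov, 500)
X1 = rng.multivariate_normal(mu1, cov, 500)

# Closed-form two-class LDA: w = Sigma^{-1} (mu1 - mu0),
# with the bias placing the boundary at the midpoint of the projected means.
w = np.linalg.solve(cov, mu1 - mu0)
b = -0.5 * w @ (mu0 + mu1)

# Sign of w^T x + b assigns each sample to one half-space; in the FF-MLP
# design, several such LDAs in parallel form the l_in -> l_1 stage.
err0 = (X0 @ w + b > 0).mean()  # fraction of class-0 samples on the wrong side
acc1 = (X1 @ w + b > 0).mean()  # fraction of class-1 samples on the right side
print(err0, acc1)
```

With well-separated means as above, `err0` stays near 0 and `acc1` near 1; for multimodal classes, one such LDA per Gaussian pair is needed, which is what the multi-neuron first layer provides.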
