Paper Title

MBGD-RDA Training and Rule Pruning for Concise TSK Fuzzy Regression Models

Paper Authors

Wu, Dongrui

Paper Abstract

To effectively train Takagi-Sugeno-Kang (TSK) fuzzy systems for regression problems, a Mini-Batch Gradient Descent with Regularization, DropRule, and AdaBound (MBGD-RDA) algorithm was recently proposed. It has demonstrated superior performance; however, it also has some limitations, e.g., it does not allow the user to specify the number of rules directly, and only Gaussian membership functions (MFs) can be used. This paper proposes two variants of MBGD-RDA to remedy these limitations, and shows that they outperform the original MBGD-RDA and the classical ANFIS algorithm with the same number of rules. Furthermore, we also propose a rule pruning algorithm for TSK fuzzy systems, which can reduce the number of rules without significantly sacrificing the regression performance. Experiments showed that the rules obtained from pruning are generally better than those trained directly from scratch, especially when Gaussian MFs are used.
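To make the abstract's terminology concrete, the sketch below shows a minimal TSK fuzzy regression system with Gaussian MFs trained by plain mini-batch gradient descent with DropRule (each mini-batch randomly silences some rules, analogous to dropout). This is an illustrative toy, not the authors' MBGD-RDA implementation: it omits regularization and AdaBound, updates only the consequent parameters, and all names (`firing`, `tsk_predict`, the toy data) are assumptions for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def firing(X, centers, sigmas, mask=None):
    """Normalized rule firing strengths: Gaussian MFs combined with the product t-norm."""
    d2 = (X[:, None, :] - centers[None]) ** 2                 # (N, R, d)
    f = np.exp(-d2 / (2 * sigmas[None] ** 2)).prod(axis=2)    # (N, R)
    if mask is not None:
        f = f * mask                                          # DropRule: silence dropped rules
    return f / (f.sum(axis=1, keepdims=True) + 1e-12)

def tsk_predict(X, centers, sigmas, w, b, mask=None):
    """First-order TSK output: firing-weighted average of linear rule consequents."""
    g = firing(X, centers, sigmas, mask)
    return (g * (X @ w.T + b)).sum(axis=1)

# Toy data: y = 2x + noise (hypothetical example, not a dataset from the paper)
X = rng.uniform(-1, 1, size=(200, 1))
y = 2 * X[:, 0] + 0.05 * rng.standard_normal(200)

R = 4                                  # number of rules, fixed by the user up front
centers = np.linspace(-1, 1, R)[:, None]
sigmas = np.full((R, 1), 0.5)
w = np.zeros((R, 1))
b = np.zeros(R)

lr, keep_prob, batch = 0.5, 0.8, 32
loss0 = np.mean((tsk_predict(X, centers, sigmas, w, b) - y) ** 2)
for epoch in range(200):
    idx = rng.permutation(len(X))
    for s in range(0, len(X), batch):
        Xb, yb = X[idx[s:s + batch]], y[idx[s:s + batch]]
        mask = (rng.random(R) < keep_prob).astype(float)      # fresh DropRule mask per batch
        if mask.sum() == 0:
            mask[rng.integers(R)] = 1.0                       # keep at least one rule active
        g = firing(Xb, centers, sigmas, mask)
        err = (g * (Xb @ w.T + b)).sum(axis=1) - yb           # prediction error on the batch
        # MSE gradients w.r.t. the consequent parameters only
        w -= lr * (2 / len(Xb)) * (g * err[:, None]).T @ Xb
        b -= lr * (2 / len(Xb)) * (g * err[:, None]).sum(axis=0)

# All rules are active at test time (mask=None), as in dropout-style methods
loss1 = np.mean((tsk_predict(X, centers, sigmas, w, b) - y) ** 2)
print(loss0, loss1)
```

Since each rule's consequent is linear in the input, the four rules can jointly represent the underlying line, so the training loss should fall close to the noise floor. Rule pruning, as proposed in the paper, would then ask how few of these rules can be kept without the test loss degrading noticeably.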
