Paper Title

Evolutionary learning of interpretable decision trees

Authors

Custode, Leonardo Lucio; Iacca, Giovanni

Abstract

Reinforcement learning techniques achieved human-level performance in several tasks in the last decade. However, in recent years, the need for interpretability emerged: we want to be able to understand how a system works and the reasons behind its decisions. Not only do we need interpretability to assess the safety of the produced systems, but we also need it to extract knowledge about unknown problems. While some techniques that optimize decision trees for reinforcement learning do exist, they usually employ greedy algorithms or do not exploit the rewards given by the environment. This means that these techniques may easily get stuck in local optima. In this work, we propose a novel approach to interpretable reinforcement learning that uses decision trees. We present a two-level optimization scheme that combines the advantages of evolutionary algorithms with the advantages of Q-learning. This way, we decompose the problem into two sub-problems: the problem of finding a meaningful and useful decomposition of the state space, and the problem of associating an action with each state. We test the proposed method on three well-known reinforcement learning benchmarks, on which it is competitive with the state of the art in both performance and interpretability. Finally, we perform an ablation study confirming that the two-level optimization scheme improves performance in non-trivial environments with respect to a one-level optimization technique.
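To make the two-level idea concrete, below is a minimal sketch, not the authors' implementation: an evolutionary outer loop searches over decision-tree-style splits of the state space, while an inner Q-learning loop learns which action to take in each leaf from the environment reward. The use of gymnasium's CartPole-v1, a fixed number of axis-aligned splits per individual, and all hyperparameters are assumptions made here for illustration.

```python
# Sketch of a two-level scheme: an evolutionary algorithm evolves state-space
# splits (the tree structure), and Q-learning assigns actions to the leaves.
# Assumes `gymnasium` and `numpy` are installed; all constants are arbitrary.
import random
import numpy as np
import gymnasium as gym

N_SPLITS = 3            # internal tests per individual (2**N_SPLITS leaves)
POP_SIZE = 20
GENERATIONS = 10
EPISODES_PER_EVAL = 5
ALPHA, GAMMA, EPS = 0.1, 0.99, 0.1


def random_split(n_features):
    # A split tests obs[feature] < threshold; thresholds are sampled in a
    # fixed range (CartPole observations mostly lie within [-4, 4]).
    return (random.randrange(n_features), random.uniform(-4.0, 4.0))


def leaf_index(obs, splits):
    # The binary outcomes of the tests form the index of a leaf.
    idx = 0
    for i, (feat, thr) in enumerate(splits):
        if obs[feat] < thr:
            idx |= 1 << i
    return idx


def evaluate(splits, env, n_actions):
    # Inner level: Q-learning over the leaves induced by the splits.
    q = np.zeros((2 ** len(splits), n_actions))
    total = 0.0
    for _ in range(EPISODES_PER_EVAL):
        obs, _ = env.reset()
        done = False
        while not done:
            s = leaf_index(obs, splits)
            a = env.action_space.sample() if random.random() < EPS \
                else int(np.argmax(q[s]))
            obs, r, terminated, truncated, _ = env.step(a)
            s2 = leaf_index(obs, splits)
            target = r + (0.0 if terminated else GAMMA * np.max(q[s2]))
            q[s, a] += ALPHA * (target - q[s, a])
            total += r
            done = terminated or truncated
    return total / EPISODES_PER_EVAL   # fitness = average episode return


def mutate(splits, n_features):
    # Replace one split with a fresh random one.
    child = list(splits)
    child[random.randrange(len(child))] = random_split(n_features)
    return child


if __name__ == "__main__":
    env = gym.make("CartPole-v1")
    n_features = env.observation_space.shape[0]
    n_actions = env.action_space.n
    pop = [[random_split(n_features) for _ in range(N_SPLITS)]
           for _ in range(POP_SIZE)]
    for g in range(GENERATIONS):
        # Outer level: evaluate, then truncation selection plus mutation.
        scored = sorted(((evaluate(ind, env, n_actions), ind) for ind in pop),
                        key=lambda t: t[0], reverse=True)
        print(f"generation {g}: best average return = {scored[0][0]:.1f}")
        parents = [ind for _, ind in scored[:POP_SIZE // 2]]
        pop = parents + [mutate(random.choice(parents), n_features)
                         for _ in range(POP_SIZE - len(parents))]
    env.close()
```

The split here is the essence of the abstract's decomposition: the evolutionary level only decides how the state space is partitioned, while the reward-driven Q-learning level decides what to do in each partition; the paper's actual tree representation and operators are richer than this fixed-depth sketch.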
