Paper Title

The Effect of Optimization Methods on the Robustness of Out-of-Distribution Detection Approaches

Paper Authors

Vahdat Abdelzad, Krzysztof Czarnecki, Rick Salay

Paper Abstract

Deep neural networks (DNNs) have become the de facto learning mechanism in different domains. Their tendency to perform unreliably on out-of-distribution (OOD) inputs hinders their adoption in critical domains. Several approaches have been proposed for detecting OOD inputs. However, existing approaches still lack robustness. In this paper, we shed light on the robustness of OOD detection (OODD) approaches by revealing the important role of optimization methods. We show that OODD approaches are sensitive to the type of optimization method used to train deep models. Optimization methods can yield different solutions to a non-convex problem, and these solutions may or may not satisfy the assumptions (e.g., about the distributions of deep features) made by OODD approaches. Furthermore, we propose a robustness score that takes the role of optimization methods into account, which provides a sound way to compare OODD approaches. In addition to comparing several OODD approaches using the proposed robustness score, we demonstrate that some optimization methods provide better solutions for OODD approaches.
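The sensitivity claim lends itself to a small experiment. The sketch below is a minimal illustration, not the paper's protocol: it assumes the maximum-softmax-probability (MSP) baseline as the OODD approach, synthetic Gaussian data standing in for the ID and OOD sets, and SGD vs. Adam as the optimization methods. The final "robustness" line is an illustrative mean-minus-standard-deviation aggregate over per-optimizer AUROCs, not the robustness score defined in the paper.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.metrics import roc_auc_score

torch.manual_seed(0)

# Synthetic stand-in data: two in-distribution Gaussian classes,
# plus a shifted, wider Gaussian acting as the OOD set.
def make_data(n=2000):
    x0 = torch.randn(n // 2, 16) + 2.0
    x1 = torch.randn(n // 2, 16) - 2.0
    x_id = torch.cat([x0, x1])
    y_id = torch.cat([torch.zeros(n // 2), torch.ones(n // 2)]).long()
    x_ood = torch.randn(n, 16) * 4.0 + 8.0  # mostly far from both ID modes
    return x_id, y_id, x_ood

def train(model, x, y, optimizer, epochs=200):
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()

def msp_score(model, x):
    # Maximum softmax probability baseline:
    # higher score means "more in-distribution".
    with torch.no_grad():
        return torch.softmax(model(x), dim=1).max(dim=1).values.numpy()

x_id, y_id, x_ood = make_data()
aurocs = {}
for name, make_opt in {
    "sgd":  lambda p: torch.optim.SGD(p, lr=0.05, momentum=0.9),
    "adam": lambda p: torch.optim.Adam(p, lr=1e-3),
}.items():
    # Same architecture and data; only the optimizer changes.
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
    train(model, x_id, y_id, make_opt(model.parameters()))
    # Reusing the training set as the ID test set, for brevity.
    scores = np.concatenate([msp_score(model, x_id), msp_score(model, x_ood)])
    labels = np.concatenate([np.ones(len(x_id)), np.zeros(len(x_ood))])
    aurocs[name] = roc_auc_score(labels, scores)
    print(f"{name}: OOD-detection AUROC = {aurocs[name]:.3f}")

# Hypothetical robustness aggregate (NOT the paper's definition):
# reward high mean AUROC, penalize variation across optimizers.
vals = np.array(list(aurocs.values()))
print("robustness (mean - std, illustrative):", vals.mean() - vals.std())
```

On this toy task the two optimizers typically reach similar classification accuracy, yet their MSP-based AUROCs can differ across seeds; that gap between equally accurate classifiers is the kind of optimizer-induced variation the abstract attributes to different solutions of the same non-convex problem.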
