Paper Title

SiTGRU: Single-Tunnelled Gated Recurrent Unit for Abnormality Detection

Authors

Habtamu Fanta, Zhiwen Shao, Lizhuang Ma

Abstract

Abnormality detection is a challenging task due to the dependence on a specific context and the unconstrained variability of practical scenarios. In recent years, it has benefited from the powerful features learnt by deep neural networks, and handcrafted features specialized for abnormality detectors. However, these approaches with large complexity still have limitations in handling long term sequential data (e.g., videos), and their learnt features do not thoroughly capture useful information. Recurrent Neural Networks (RNNs) have been shown to be capable of robustly dealing with temporal data in long term sequences. In this paper, we propose a novel version of Gated Recurrent Unit (GRU), called Single Tunnelled GRU for abnormality detection. Particularly, the Single Tunnelled GRU discards the heavy weighted reset gate from GRU cells that overlooks the importance of past content by only favouring current input to obtain an optimized single gated cell model. Moreover, we substitute the hyperbolic tangent activation in standard GRUs with sigmoid activation, as the former suffers from performance loss in deeper networks. Empirical results show that our proposed optimized GRU model outperforms standard GRU and Long Short Term Memory (LSTM) networks on most metrics for detection and generalization tasks on CUHK Avenue and UCSD datasets. The model is also computationally efficient with reduced training and testing time over standard RNNs.
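
The abstract specifies two modifications to the standard GRU cell: the reset gate is dropped, and the hyperbolic tangent in the candidate state is replaced with a sigmoid. The NumPy sketch below illustrates one plausible form of a cell under those two changes; the class name SiTGRUCell, the parameter names (Wz, Uz, Wh, Uh), and the initialization scheme are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SiTGRUCell:
    """Sketch of a single-tunnelled GRU step: no reset gate,
    sigmoid (rather than tanh) candidate activation.
    Names and initialization are illustrative, not from the paper."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Update gate parameters (the single remaining gate).
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.bz = np.zeros(hidden_size)
        # Candidate-state parameters (no reset gate applied to h_prev).
        self.Wh = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uh = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.bh = np.zeros(hidden_size)

    def step(self, x, h_prev):
        # Update gate decides how much of the candidate to admit.
        z = sigmoid(self.Wz @ x + self.Uz @ h_prev + self.bz)
        # Candidate state: a standard GRU would use tanh and gate h_prev
        # with a reset gate; here h_prev enters directly, through sigmoid.
        h_cand = sigmoid(self.Wh @ x + self.Uh @ h_prev + self.bh)
        # Convex combination of past state and candidate.
        return (1.0 - z) * h_prev + z * h_cand

# Illustrative usage on a random 10-step sequence.
cell = SiTGRUCell(input_size=16, hidden_size=32)
h = np.zeros(32)
for x in np.random.default_rng(1).normal(size=(10, 16)):
    h = cell.step(x, h)
```

With the reset gate removed, each step reduces to a convex combination of the previous hidden state and a sigmoid candidate, eliminating roughly a third of the gated parameters of a standard GRU cell, which is consistent with the reduced training and testing time reported in the abstract.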
