Paper Title
Concise Logarithmic Loss Function for Robust Training of Anomaly Detection Model

Author

Park, YeongHyeon

Abstract


Recently, deep learning-based algorithms have been widely adopted because they can establish anomaly detection models with little or no domain knowledge of the task. However, to train an artificial neural network more stably, it is better to define an appropriate neural network structure or loss function. For training anomaly detection models, the mean squared error (MSE) function is widely adopted. In this paper, a novel loss function, logarithmic mean squared error (LMSE), is proposed to train neural networks more stably. This study covers a variety of comparisons: mathematical analysis, visualization in the differential domain for backpropagation, loss convergence during training, and anomaly detection performance. Overall, LMSE is superior to the existing MSE function in terms of robustness of loss convergence and anomaly detection performance. The LMSE function is expected to be applicable not only for training anomaly detection models but also for general generative neural networks.
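To make the comparison between the two losses concrete, below is a minimal sketch contrasting plain MSE with a log-scaled variant. The form `log(1 + MSE)` used here is an assumption for illustration only; the paper's exact definition of LMSE may differ. The sketch shows the property the abstract alludes to: the logarithm compresses large reconstruction errors, which can keep early-training gradients from blowing up.

```python
import numpy as np

def mse(x, x_hat):
    """Mean squared error between input x and its reconstruction x_hat."""
    return np.mean((x - x_hat) ** 2)

def lmse(x, x_hat):
    """Log-scaled MSE (hypothetical form log(1 + MSE); the paper's exact
    definition may differ). For any positive error e, log(1 + e) < e,
    so large errors are damped while small errors are nearly unchanged."""
    return np.log(1.0 + mse(x, x_hat))

# Toy reconstruction example: small per-element errors.
x = np.array([0.0, 1.0, 2.0])
x_hat = np.array([0.1, 0.9, 2.2])

print("MSE :", mse(x, x_hat))   # per-sample squared errors averaged
print("LMSE:", lmse(x, x_hat))  # always <= MSE, much smaller for large errors
```

For small errors the two losses are nearly identical (since log(1 + e) ≈ e for small e), while for large outlier errors LMSE grows only logarithmically, which is one plausible mechanism behind the more stable convergence reported in the abstract.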