Paper Title

Time Series Forecasting Models Copy the Past: How to Mitigate

Paper Authors

Chrysoula Kosma, Giannis Nikolentzos, Nancy Xu, Michalis Vazirgiannis

Paper Abstract

Time series forecasting is at the core of important application domains posing significant challenges to machine learning algorithms. Recently neural network architectures have been widely applied to the problem of time series forecasting. Most of these models are trained by minimizing a loss function that measures predictions' deviation from the real values. Typical loss functions include mean squared error (MSE) and mean absolute error (MAE). In the presence of noise and uncertainty, neural network models tend to replicate the last observed value of the time series, thus limiting their applicability to real-world data. In this paper, we provide a formal definition of the above problem and we also give some examples of forecasts where the problem is observed. We also propose a regularization term penalizing the replication of previously seen values. We evaluate the proposed regularization term both on synthetic and real-world datasets. Our results indicate that the regularization term mitigates to some extent the aforementioned problem and gives rise to more robust models.
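The abstract only names the idea of a regularization term that penalizes replicating previously seen values; below is a minimal PyTorch-style sketch of one way such a penalty could be combined with MSE. The function name `anti_copy_loss`, the exponential penalty form, and the weight `lam` are illustrative assumptions and do not reproduce the paper's exact formulation.

```python
import torch

def anti_copy_loss(pred, target, last_obs, lam=0.1):
    """MSE plus an illustrative penalty that grows when the forecast
    merely replicates the last observed value of the input window.
    The exponential form and the weight `lam` are assumptions for
    illustration, not the paper's exact regularization term."""
    # Standard MSE between predictions and ground-truth values.
    mse = torch.mean((pred - target) ** 2)
    # Penalty is close to 1 when pred sits on top of the last observation
    # and decays toward 0 as the prediction moves away from it.
    copy_penalty = torch.mean(torch.exp(-(pred - last_obs) ** 2))
    return mse + lam * copy_penalty

# Hypothetical usage with a one-step-ahead forecaster `model`:
# pred = model(x)                            # x: (batch, window)
# loss = anti_copy_loss(pred, y, x[:, -1:])  # x[:, -1:]: last observed value
# loss.backward()
```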
