Paper Title
MQTransformer: Multi-Horizon Forecasts with Context Dependent and Feedback-Aware Attention
Paper Authors
Paper Abstract
Recent advances in neural forecasting have produced major improvements in accuracy for probabilistic demand prediction. In this work, we propose novel improvements to the current state of the art by incorporating changes inspired by recent advances in Transformer architectures for Natural Language Processing. We develop a novel decoder-encoder attention for context alignment, improving forecasting accuracy by allowing the network to study its own history based on the context for which it is producing a forecast. We also present a novel positional encoding that allows the neural network to learn context-dependent seasonality functions as well as arbitrary holiday distances. Finally, we show that the state-of-the-art MQ-Forecaster (Wen et al., 2017) models display excess variability by failing to leverage previous errors in the forecast to improve accuracy. We propose a novel decoder self-attention scheme for forecasting that produces significant improvements in the excess variation of the forecast.
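The abstract names the learned, context-dependent positional encoding but does not specify its construction. The sketch below is a minimal illustration of one plausible reading, assuming PyTorch: calendar features (day of week, week of year, and a signed distance to the nearest holiday) are embedded and summed, letting the network learn seasonality shapes and holiday effects rather than relying on fixed sinusoids. The class name, feature set, and clipping range are illustrative assumptions, not the paper's specification.

```python
import torch
import torch.nn as nn


class CalendarPositionalEncoding(nn.Module):
    """Learned positional encoding from calendar context (hypothetical sketch)."""

    def __init__(self, d_model: int = 64, max_holiday_dist: int = 30):
        super().__init__()
        self.day_of_week = nn.Embedding(7, d_model)    # Mon..Sun -> 0..6
        self.week_of_year = nn.Embedding(53, d_model)  # ISO weeks -> 0..52
        # Signed distance to the nearest holiday, clipped to +/- max_holiday_dist
        # and shifted to a non-negative embedding index.
        self.holiday_dist = nn.Embedding(2 * max_holiday_dist + 1, d_model)
        self.max_d = max_holiday_dist

    def forward(self, dow: torch.Tensor, woy: torch.Tensor, hdist: torch.Tensor):
        # dow, woy, hdist: integer tensors of shape (batch, time)
        h = hdist.clamp(-self.max_d, self.max_d) + self.max_d
        return self.day_of_week(dow) + self.week_of_year(woy) + self.holiday_dist(h)


# Toy usage: 2 series, 4 time steps.
enc = CalendarPositionalEncoding()
dow = torch.tensor([[0, 1, 2, 3], [4, 5, 6, 0]])
woy = torch.tensor([[10, 10, 10, 11], [51, 51, 52, 52]])
hdist = torch.tensor([[-40, -2, 0, 3], [7, 6, 5, 4]])
print(enc(dow, woy, hdist).shape)  # torch.Size([2, 4, 64])
```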
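Similarly, the feedback-aware decoder self-attention is only described at a high level: the decoder should be able to leverage errors of previous forecasts to damp excess variability. The sketch below shows one way such a mechanism could be wired up, again assuming PyTorch; the module name, the scalar error-embedding scheme, and all dimensions are illustrative assumptions rather than the authors' implementation. Each horizon's decoder state attends over a concatenation of embedded past-error tokens and the horizon tokens themselves, so the forecast can "see" how wrong earlier forecasts were.

```python
import torch
import torch.nn as nn


class FeedbackAwareDecoderAttention(nn.Module):
    """Decoder self-attention over [past-error tokens ++ horizon tokens] (sketch)."""

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Embed scalar forecast errors (actual minus previous forecast) into d_model.
        self.error_embed = nn.Linear(1, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, horizon_tokens: torch.Tensor, past_errors: torch.Tensor):
        # horizon_tokens: (batch, H, d_model) -- decoder states, one per horizon
        # past_errors:    (batch, T, 1)       -- errors of earlier forecasts
        err_tokens = self.error_embed(past_errors)             # (batch, T, d_model)
        context = torch.cat([err_tokens, horizon_tokens], 1)   # (batch, T+H, d_model)
        # Queries are the horizon tokens; keys/values include the error history.
        out, _ = self.attn(horizon_tokens, context, context)
        return self.norm(horizon_tokens + out)                 # residual + norm


# Toy usage: 8 series, 5 past errors, 3 forecast horizons.
layer = FeedbackAwareDecoderAttention()
h = torch.randn(8, 3, 64)
e = torch.randn(8, 5, 1)
print(layer(h, e).shape)  # torch.Size([8, 3, 64])
```

Because the error history is strictly in the past relative to the forecast creation time, no causal mask is needed over the error tokens in this toy setup.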