Paper Title

Privacy-preserving household load forecasting based on non-intrusive load monitoring: A federated deep learning approach

Authors

Xinxin Zhou, Jingru Feng, Jian Wang, Jianhong Pan

Abstract

Load forecasting is essential for power system analysis and grid planning. To this end, we propose a household load forecasting method based on federated deep learning and non-intrusive load monitoring (NILM). To the best of our knowledge, this is the first study of federated learning (FL) for NILM-based household load forecasting. In this method, the aggregate power is decomposed into individual appliance power by non-intrusive load monitoring, and the power of each appliance is predicted separately using a federated deep learning model. Finally, the predicted power values of the individual appliances are summed to form the total power forecast. Predicting each appliance separately avoids the error caused by the strong time dependence in a single device's power signal. In the federated deep learning prediction model, the households that own the power data share the parameters of their local models rather than the local power data, thereby preserving the privacy of household user data. Case study results demonstrate that the proposed approach achieves better prediction performance than the conventional method of directly forecasting the aggregated signal as a whole. In addition, experiments in various federated learning settings are designed and implemented to verify the effectiveness of the method.
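The privacy mechanism described in the abstract — households share only local model parameters, which a server averages into a global model — can be illustrated with a minimal federated-averaging (FedAvg) sketch. The linear model, synthetic per-appliance data, and hyperparameters below are illustrative assumptions, not the paper's actual architecture or dataset.

```python
# Minimal FedAvg sketch: each "household" fits a local model on its own
# (per-appliance) power windows and shares only the learned weights; the
# server averages the weights. Raw power data never leaves a client.
import numpy as np

rng = np.random.default_rng(0)

def local_train(weights, X, y, lr=0.05, epochs=20):
    """One household's local update: linear forecaster fit by gradient descent."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

def fed_avg(client_weights):
    """Server step: average the clients' parameter vectors."""
    return np.mean(client_weights, axis=0)

# Synthetic data: lagged power features -> next power value, 3 households
# drawn from the same underlying appliance behaviour (hypothetical).
n_features = 4
true_w = np.array([0.5, 0.2, 0.2, 0.1])
clients = []
for _ in range(3):
    X = rng.random((50, n_features))
    y = X @ true_w + 0.01 * rng.standard_normal(50)
    clients.append((X, y))

global_w = np.zeros(n_features)
for _ in range(30):  # communication rounds
    updates = [local_train(global_w, X, y) for X, y in clients]
    global_w = fed_avg(updates)
```

After the communication rounds, `global_w` approximates the shared underlying coefficients even though the server only ever saw parameter vectors, which is the privacy property the abstract relies on.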
