Paper Title

Federated Learning Cost Disparity for IoT Devices

Paper Authors

Sheeraz A. Alvi, Yi Hong, Salman Durrani

Paper Abstract

Federated learning (FL) promotes predictive model training at Internet of things (IoT) devices by evading data collection costs in terms of energy, time, and privacy. We model the learning gain achieved by an IoT device against its participation cost as its utility. Due to device heterogeneity, the local model learning cost and its quality, which can be time-varying, differ from device to device. We show that this variation results in utility unfairness because the same global model is shared among the devices. By default, the master is unaware of the local model computation and transmission costs of the devices, and is thus unable to address the utility unfairness problem. Also, a device may exploit this lack of knowledge at the master to intentionally reduce its expenditure and thereby enhance its utility. We propose to control the quality of the global model shared with the devices, in each round, based on their contribution and expenditure. This is achieved by employing differential privacy to curtail global model divulgence based on the learning contribution. In addition, we devise adaptive computation and transmission policies for each device to control its expenditure in order to mitigate utility unfairness. Our results show that the proposed scheme reduces the standard deviation of the energy cost of devices by 99% in comparison to the benchmark scheme, while the standard deviation of the training loss of devices varies around 0.103.
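As a rough illustration of the idea of contribution-based global model quality control described in the abstract (not the authors' actual implementation), the sketch below adds Gaussian differential-privacy-style noise to the global model before it is shared, with a noise scale that shrinks as a device's learning contribution grows. The function name `share_global_model`, the contribution scores, the noise calibration, and the constant `base_sigma` are all hypothetical assumptions made for this example; the paper's precise noise mechanism and privacy accounting are not reproduced here.

```python
import numpy as np

def share_global_model(global_weights, contributions, base_sigma=0.1):
    """Return a per-device copy of the global model with Gaussian noise
    whose scale decreases with the device's learning contribution.

    Hypothetical sketch only: the contribution metric and noise
    calibration here are illustrative assumptions, not the paper's scheme.
    """
    noisy_models = {}
    max_contrib = max(contributions.values())
    for device_id, contrib in contributions.items():
        # Devices that contributed more receive a less-perturbed,
        # i.e. higher-quality, version of the global model.
        sigma = base_sigma * (1.0 - contrib / max_contrib) + 1e-3
        noisy_models[device_id] = [
            w + np.random.normal(0.0, sigma, size=w.shape)
            for w in global_weights
        ]
    return noisy_models

# Example usage with dummy weights and assumed contribution scores.
weights = [np.zeros((4, 4)), np.zeros(4)]
contribs = {"device_a": 0.9, "device_b": 0.3}
models = share_global_model(weights, contribs)
```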
