Paper Title

Split Learning without Local Weight Sharing to Enhance Client-side Data Privacy

Paper Authors

Ngoc Duy Pham, Tran Khoa Phan, Alsharif Abuadbba, Yansong Gao, Doan Nguyen, Naveen Chilamkurti

Paper Abstract

Split learning (SL) aims to protect user data privacy by distributing deep models between client and server and keeping private data locally. In SL training with multiple clients, the local model weights are shared among the clients for local model updates. This paper first reveals, through model inversion attacks, the data privacy leakage exacerbated by local weight sharing among clients in SL. Then, to reduce this leakage, we propose and analyze privacy-enhanced SL (P-SL), i.e., SL without local weight sharing. We further propose parallelized P-SL to expedite the training process by duplicating multiple server-side model instances without compromising accuracy. Finally, we explore P-SL with late-participating clients and devise a server-side cache-based training method to address the forgetting phenomenon in SL when late clients join. Experimental results demonstrate that P-SL helps reduce client-side data leakage by up to 50%, which essentially achieves a better privacy-accuracy trade-off than the current trend of using differential privacy mechanisms. Moreover, P-SL and its cache-based version achieve accuracy comparable to baseline SL under various data distributions, while incurring lower computation and communication costs. Additionally, cache-based training in P-SL mitigates the negative effects of forgetting, stabilizes learning, and enables practical, low-complexity training in a dynamic environment with late-arriving clients.
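The core idea described in the abstract can be illustrated with a minimal sketch. This is a hypothetical toy model, not the paper's actual architecture or algorithm: each side of the client/server cut is reduced to a single scalar weight, each client keeps its own private client-side weight (never shared with other clients, unlike vanilla SL), and all clients train one shared server-side weight. Only "smashed" activations and their gradients cross the cut.

```python
import random

random.seed(0)

# Toy P-SL sketch (illustrative only): private client-side weights,
# one shared server-side weight, relay-style training.

class Client:
    def __init__(self):
        self.w = random.uniform(0.1, 1.0)  # private client-side weight
        self.x = 0.0

    def forward(self, x):
        self.x = x
        return self.w * x                  # smashed data sent to server

    def backward(self, g_act, lr=0.01):
        self.w -= lr * g_act * self.x      # local update, never shared

class Server:
    def __init__(self):
        self.v = random.uniform(0.1, 1.0)  # shared server-side weight

    def step(self, a, y, lr=0.01):
        pred = self.v * a                  # server-side forward pass
        g_pred = 2.0 * (pred - y)          # gradient of squared error
        g_act = g_pred * self.v            # gradient sent back to client
        self.v -= lr * g_pred * a          # server-side update
        return g_act

clients = [Client() for _ in range(3)]
server = Server()
target = lambda x: 2.0 * x                 # toy ground-truth function

for _ in range(200):
    for c in clients:                      # clients take turns (relay-style)
        x = random.uniform(-1.0, 1.0)
        g = server.step(c.forward(x), target(x))
        c.backward(g)

# Unlike vanilla SL, clients never copy each other's weights, so
# inverting one client's model reveals nothing about the others' data.
```

The design trade-off the abstract points at is visible even here: because `w` is never synchronized, each client's model reflects only its own data, which is why the paper pairs P-SL with server-side caching to cope with heterogeneous and late-arriving clients.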
