Paper Title


Tighter Generalization Bounds for Iterative Differentially Private Learning Algorithms

Authors

Fengxiang He, Bohan Wang, Dacheng Tao

Abstract


This paper studies the relationship between generalization and privacy preservation in iterative learning algorithms in two sequential steps. We first establish an alignment between generalization and privacy preservation for any learning algorithm. We prove that $(\varepsilon, \delta)$-differential privacy implies an on-average generalization bound for multi-database learning algorithms, which further leads to a high-probability bound for any learning algorithm. This high-probability bound also implies a PAC-learnability guarantee for differentially private learning algorithms. We then investigate how the iterative nature shared by most learning algorithms influences privacy preservation and, in turn, generalization. Three composition theorems are proposed to approximate the differential privacy of any iterative algorithm through the differential privacy of each of its iterations. Integrating the two steps, we eventually deliver generalization bounds for iterative learning algorithms, which suggest that privacy preservation and generalization can be enhanced simultaneously. Our results are strictly tighter than existing work. In particular, our generalization bounds do not depend on the model size, which is prohibitively large in deep learning; this sheds light on the generalizability of deep learning. These results apply to a wide spectrum of learning algorithms. In this paper, we apply them to stochastic gradient Langevin dynamics and agnostic federated learning as examples.
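For context on one of the two example algorithms, the following is a minimal sketch of a stochastic gradient Langevin dynamics (SGLD) update: plain gradient descent plus Gaussian noise whose scale matches the step size. The `sgld_step` function and the quadratic toy loss are illustrative assumptions, not taken from the paper; the paper's actual analysis concerns the differential privacy and generalization of such iterates.

```python
import numpy as np

def sgld_step(theta, grad_fn, lr, rng):
    """One SGLD update: a gradient step perturbed by Gaussian noise
    with variance 2 * lr, so the iterates sample (approximately)
    from a distribution proportional to exp(-loss)."""
    noise = rng.normal(0.0, np.sqrt(2.0 * lr), size=theta.shape)
    return theta - lr * grad_fn(theta) + noise

# Toy usage: loss L(theta) = 0.5 * ||theta||^2, so grad L(theta) = theta.
rng = np.random.default_rng(0)
theta = np.ones(3)
for _ in range(1000):
    theta = sgld_step(theta, lambda t: t, lr=0.01, rng=rng)
# After many steps the iterates hover near the minimum at the origin,
# jittered by the injected noise rather than converging exactly.
```

The injected noise is what makes each iteration differentially private, which is the mechanism the paper's composition theorems then accumulate over iterations.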
