Paper Title

Coded Machine Unlearning

Paper Authors

Nasser Aldaghri, Hessam Mahdavifar, Ahmad Beirami

Paper Abstract

There are applications that may require removing the trace of a sample from the system, e.g., a user requests their data to be deleted, or corrupted data is discovered. Simply removing a sample from storage units does not necessarily remove its entire trace since downstream machine learning models may store some information about the samples used to train them. A sample can be perfectly unlearned if we retrain all models that used it from scratch with that sample removed from their training dataset. When multiple such unlearning requests are expected to be served, unlearning by retraining becomes prohibitively expensive. Ensemble learning enables the training data to be split into smaller disjoint shards that are assigned to non-communicating weak learners. Each shard is used to produce a weak model. These models are then aggregated to produce the final central model. This setup introduces an inherent trade-off between performance and unlearning cost, as reducing the shard size reduces the unlearning cost but may cause degradation in performance. In this paper, we propose a coded learning protocol where we utilize linear encoders to encode the training data into shards prior to the learning phase. We also present the corresponding unlearning protocol and show that it satisfies the perfect unlearning criterion. Our experimental results show that the proposed coded machine unlearning provides a better performance versus unlearning cost trade-off compared to the uncoded baseline.
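The sharding-and-aggregation setup and the unlearning-by-retraining idea described above can be sketched in code. The following is a minimal illustration, not the paper's actual protocol: it assumes a linear-regression task, a random binary generator matrix `G` as the linear encoder (the paper's specific encoder construction is not reproduced here), ridge regression as the weak learner, and prediction averaging as the aggregation rule. Unlearning a sample re-encodes and retrains only the coded shards that used it, which is what makes the cost lower than full retraining.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data (illustrative only).
n, d = 600, 5
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

s = 6                              # number of partitions / coded shards (assumed)
parts = np.split(np.arange(n), s)  # equal-size disjoint partitions
part_size = len(parts[0])

# Hypothetical binary generator matrix: coded shard j is the row-wise sum of
# the partitions selected by row G[j]. The paper also uses a linear encoder,
# but this particular G is an assumption for illustration.
G = rng.integers(0, 2, size=(s, s))
np.fill_diagonal(G, 1)             # ensure every partition feeds some shard

removed = np.zeros(n, dtype=bool)  # tracks samples that have been unlearned

def encode_shard(j):
    """Linearly combine the (not-yet-removed) rows of the selected partitions."""
    Xc = np.zeros((part_size, d))
    yc = np.zeros(part_size)
    for k in range(s):
        if G[j, k]:
            keep = ~removed[parts[k]]
            Xc[keep] += X[parts[k]][keep]
            yc[keep] += y[parts[k]][keep]
    return Xc, yc

def train_shard(j):
    """Weak learner: closed-form ridge regression on coded shard j."""
    Xc, yc = encode_shard(j)
    lam = 1e-3                     # small ridge term for numerical stability
    return np.linalg.solve(Xc.T @ Xc + lam * np.eye(d), Xc.T @ yc)

models = [train_shard(j) for j in range(s)]

def predict(x):
    # Aggregate weak models by averaging their predictions (one simple choice).
    return np.mean([x @ w for w in models])

def unlearn(i):
    """Perfect unlearning: re-encode and retrain only the shards that saw sample i."""
    removed[i] = True
    k = i // part_size             # partition containing sample i
    for j in range(s):
        if G[j, k]:                # shard j's encoded data involved sample i
            models[j] = train_shard(j)

unlearn(0)                         # retrains only the affected coded shards
```

Shards with `G[j, k] = 0` are untouched by the request, so the fraction of shards retrained per request is governed by the density of the encoder, which is the lever behind the performance-versus-unlearning-cost trade-off the abstract describes.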
