Paper Title


Proof of Federated Training: Accountable Cross-Network Model Training and Inference

Paper Authors

Chakraborty, Sarthak, Chakraborty, Sandip

Paper Abstract


Blockchain has been widely adopted to design accountable federated learning frameworks; however, existing frameworks do not scale to distributed model training over multiple independent blockchain networks. To store pre-trained models on a blockchain, current approaches primarily embed a model using its structural properties, which are neither scalable for cross-chain exchange nor suitable for cross-chain verification. This paper proposes an architectural framework for cross-chain verifiable model training using federated learning, called Proof of Federated Training (PoFT), the first of its kind that enables a federated training procedure to span clients across multiple blockchain networks. Instead of structural embedding, PoFT uses the model parameters to embed the model over a blockchain and then applies a verifiable model exchange between two blockchain networks for cross-network model training. We implement and test PoFT over a large-scale setup using Amazon EC2 instances and observe that cross-chain training can significantly boost model efficacy, while PoFT incurs only marginal overhead for inter-chain model exchanges.
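The abstract's core idea of embedding a model via its parameters (rather than its structure) can be illustrated with a minimal sketch: serialize the parameters deterministically, store only a compact digest on-chain as the commitment, and verify the full parameter blob against that digest when it is exchanged across chains. The function names and the toy parameter layout below are illustrative assumptions, not the PoFT API; the paper's actual scheme is more involved.

```python
import hashlib
import json

def embed_model(params):
    """Deterministically serialize model parameters and hash them.

    The digest is a compact commitment suitable for on-chain storage;
    the full parameter blob can be exchanged off-chain or across chains
    and checked against the stored digest. (Illustrative sketch only.)
    """
    blob = json.dumps(params, sort_keys=True).encode("utf-8")
    return blob, hashlib.sha256(blob).hexdigest()

def verify_model(blob, digest):
    """Check that a received parameter blob matches the on-chain digest."""
    return hashlib.sha256(blob).hexdigest() == digest

# Toy two-layer model parameters as nested lists (hypothetical layout).
params = {"layer1": [[0.1, -0.2], [0.3, 0.4]], "layer2": [0.5, -0.6]}
blob, digest = embed_model(params)
assert verify_model(blob, digest)             # honest exchange verifies
assert not verify_model(blob + b" ", digest)  # tampering is detected
```

Unlike structural embeddings, this commitment depends only on the parameter values, so the same verification logic works regardless of which network the model originated from.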
