Paper Title


Privacy-Preserving Logistic Regression Training with A Faster Gradient Variant

Paper Author

Chiang, John

Paper Abstract


Training logistic regression over encrypted data has been a compelling approach to addressing security concerns for several years. In this paper, we introduce an efficient gradient variant, called the $quadratic$ $gradient$, which can be used for privacy-preserving logistic regression training. We enhance Nesterov's Accelerated Gradient (NAG), the Adaptive Gradient Algorithm (Adagrad), and Adam by incorporating quadratic gradients, and we evaluate these improved algorithms on various datasets. Experimental results demonstrate that the enhanced algorithms converge significantly faster than traditional first-order gradient methods. Moreover, we apply the enhanced NAG method to implement homomorphic logistic regression training, achieving comparable results within only 4 iterations. The quadratic gradient approach has strong potential to integrate first-order gradient descent/ascent algorithms with the second-order Newton-Raphson method, and it could be applied to a wide range of numerical optimization problems.
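The abstract does not spell out how the quadratic gradient is built, so the following is only a minimal sketch of one plausible reading: the ordinary gradient is rescaled element-wise by a diagonal preconditioner derived from a fixed Hessian bound for logistic regression (here assumed to be $-\frac{1}{4}X^{T}X$), and the step size is allowed to exceed 1. The function name `quadratic_gradient_ascent`, the `epsilon` smoothing term, and the specific step-size schedule are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def quadratic_gradient_ascent(X, y, iters=4, epsilon=1e-8):
    """Sketch: logistic-regression training with a diagonally rescaled
    ("quadratic") gradient. The scaling is built from the fixed Hessian
    bound (1/4) X^T X; this construction and the step-size schedule are
    assumptions for illustration and may differ from the paper."""
    n, d = X.shape
    # Fixed Hessian substitute (bounded-Hessian trick for logistic regression).
    B_tilde = 0.25 * X.T @ X
    # One positive scaling factor per parameter, from absolute row sums of B_tilde.
    B_bar = 1.0 / (epsilon + np.abs(B_tilde).sum(axis=1))
    w = np.zeros(d)
    for t in range(iters):
        p = sigmoid(X @ w)
        g = X.T @ (y - p)                 # ordinary log-likelihood gradient
        G = B_bar * g                     # "quadratic gradient": element-wise rescaled gradient
        w += (1.0 + 1.0 / (t + 1)) * G    # step size larger than 1 (assumed schedule)
    return w

# Example usage on a tiny synthetic dataset (hypothetical):
# X = np.hstack([np.ones((100, 1)), np.random.randn(100, 3)])
# y = (np.random.rand(100) < sigmoid(X @ np.array([0.5, 1.0, -2.0, 0.0]))).astype(float)
# w = quadratic_gradient_ascent(X, y)
```

The same rescaling could in principle be dropped into NAG, Adagrad, or Adam by replacing the raw gradient `g` with `G` in their update rules, which is how the abstract describes the enhanced variants at a high level.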
