Paper Title

Near-Optimal Algorithms for Group Distributionally Robust Optimization and Beyond

Paper Authors

Tasuku Soma, Khashayar Gatmiry, Sharut Gupta, Stefanie Jegelka

Paper Abstract

Distributionally robust optimization (DRO) can improve the robustness and fairness of learning methods. In this paper, we devise stochastic algorithms for a class of DRO problems including group DRO, subpopulation fairness, and empirical conditional value at risk (CVaR) optimization. Our new algorithms achieve faster convergence rates than existing algorithms for multiple DRO settings. We also provide a new information-theoretic lower bound that implies our bounds are tight for group DRO. Empirically, too, our algorithms outperform known methods.
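The paper's own near-optimal algorithms are not reproduced in this abstract, but the group DRO objective it targets, min over model parameters of the worst (q-weighted) mixture of per-group losses over the probability simplex, can be illustrated with a standard baseline: SGD on the model combined with multiplicative-weights ascent on the group distribution. This is a minimal sketch under assumed toy data and step sizes; none of the groups, losses, or hyperparameters below come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: 3 groups with different linear targets (illustrative only).
n_groups, d, n = 3, 5, 200
true_w = [rng.normal(size=d) * (g + 1) for g in range(n_groups)]
X = [rng.normal(size=(n, d)) for _ in range(n_groups)]
Y = [X[g] @ true_w[g] + 0.1 * rng.normal(size=n) for g in range(n_groups)]

def group_losses(w):
    """Mean squared error of parameters w on each group."""
    return np.array([np.mean((X[g] @ w - Y[g]) ** 2) for g in range(n_groups)])

# Baseline group-DRO loop: SGD on w against the q-weighted loss,
# exponentiated-gradient (multiplicative-weights) ascent on q over the simplex.
w = np.zeros(d)
q = np.ones(n_groups) / n_groups      # adversarial distribution over groups
eta_w, eta_q = 0.01, 0.1              # step sizes (chosen by hand for this toy)
for t in range(2000):
    g = rng.integers(n_groups)        # sample a group uniformly
    i = rng.integers(n)               # sample an example from that group
    grad_w = 2 * (X[g][i] @ w - Y[g][i]) * X[g][i]
    # n_groups * q[g] reweights the uniform sample into an unbiased
    # stochastic gradient of the q-weighted objective.
    w -= eta_w * n_groups * q[g] * grad_w
    losses = group_losses(w)
    q *= np.exp(eta_q * losses)       # up-weight the currently worst groups
    q /= q.sum()

print("worst-group MSE:", group_losses(w).max())
```

Since no single w fits all three groups, the worst-group loss plateaus above zero; the adversarial q concentrates on whichever groups remain hardest, which is exactly the failure mode uniform empirical risk minimization ignores.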
