Title

Local and Global Convergence of General Burer-Monteiro Tensor Optimizations

Authors

Shuang Li, Qiuwei Li

Abstract

Tensor optimization is crucial to massive machine learning and signal processing tasks. In this paper, we consider tensor optimization with a convex and well-conditioned objective function and reformulate it into a nonconvex optimization using the Burer-Monteiro type parameterization. We analyze the local convergence of applying vanilla gradient descent to the factored formulation and establish a local regularity condition under mild assumptions. We also provide a linear convergence analysis of the gradient descent algorithm started in a neighborhood of the true tensor factors. Complementary to the local analysis, this work also characterizes the global geometry of the best rank-one tensor approximation problem and demonstrates that for orthogonally decomposable tensors the problem has no spurious local minima and all saddle points are strict except for the one at zero which is a third-order saddle point.
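
To make the factored formulation concrete, below is a minimal sketch (not code from the paper) of vanilla gradient descent applied to a Burer-Monteiro parameterization, assuming the simplest convex, well-conditioned objective f(T) = 0.5 * ||T - T*||_F^2 and a rank-r CP-style factorization T = sum_i a_i ⊗ b_i ⊗ c_i; the variable names, dimensions, step size, and iteration count are all illustrative choices, not values from the paper.

```python
import numpy as np

# Minimal sketch: vanilla gradient descent on a Burer-Monteiro factored
# least-squares objective
#     g(A, B, C) = 0.5 * || sum_i a_i ⊗ b_i ⊗ c_i - T* ||_F^2 ,
# i.e. the convex, well-conditioned f(T) = 0.5 * ||T - T*||_F^2 composed with
# a rank-r CP-style parameterization.  All problem sizes are toy values.

rng = np.random.default_rng(0)
n, r = 10, 3

# Ground-truth factors and the target tensor T*.
A_true, B_true, C_true = rng.standard_normal((3, n, r))
T_star = np.einsum('ir,jr,kr->ijk', A_true, B_true, C_true)

def assemble(A, B, C):
    """Form sum_i a_i ⊗ b_i ⊗ c_i from the factor matrices."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Initialize in a neighborhood of the true factors, mirroring the local
# analysis described in the abstract.
noise = 0.1
A = A_true + noise * rng.standard_normal((n, r))
B = B_true + noise * rng.standard_normal((n, r))
C = C_true + noise * rng.standard_normal((n, r))

step = 1e-3  # hand-picked step size for this toy instance
for _ in range(5000):
    R = assemble(A, B, C) - T_star            # residual, i.e. grad of f at T
    # Chain rule: contract the residual with the other two factor blocks.
    grad_A = np.einsum('ijk,jr,kr->ir', R, B, C)
    grad_B = np.einsum('ijk,ir,kr->jr', R, A, C)
    grad_C = np.einsum('ijk,ir,jr->kr', R, A, B)
    A -= step * grad_A
    B -= step * grad_B
    C -= step * grad_C

print("final loss:", 0.5 * np.linalg.norm(assemble(A, B, C) - T_star) ** 2)
```

On this toy instance the loss shrinks geometrically from a near-true initialization, which is the behavior the abstract's local linear convergence result formalizes for general convex, well-conditioned objectives; the best rank-one approximation setting studied in the global analysis corresponds to r = 1.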
