Paper Title

Revisiting Robust Model Fitting Using Truncated Loss

Authors

Fei Wen, Hewen Wei, Yipeng Liu, Peilin Liu

Abstract

Robust fitting is a fundamental problem in low-level vision, which is typically achieved by maximum consensus (MC) estimators that identify inliers first, or by M-estimators directly. While these two methods are distinctly preferred in different applications, truncated-loss-based M-estimators are similar to MC in that they can also identify inliers. This work revisits a formulation that achieves simultaneous inlier identification and model estimation (SIME) using a truncated loss. It has a generalized form that adapts to both linear and nonlinear residual models. We show that, as SIME takes the fitting residual into account when finding inliers, its lowest achievable residual in model fitting is lower than that of MC robust fitting. An alternating minimization (AM) algorithm is then employed to solve the SIME formulation. Meanwhile, a semidefinite relaxation (SDR) embedded AM algorithm is developed to ease the high nonconvexity of the SIME formulation. Furthermore, the new algorithms are applied to various 2D/3D registration problems. Experimental results show that the new algorithms significantly outperform RANSAC and deterministic approximate MC methods at high outlier ratios. Moreover, in rotation and Euclidean registration problems, the new algorithms also compare favorably with state-of-the-art registration methods, especially at high noise and outlier levels. Code is available at \textit{https://github.com/FWen/mcme.git}.
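The truncated-loss idea behind SIME can be illustrated for a linear residual model with a minimal alternating-minimization sketch (a hypothetical toy illustration, not the authors' MCME code; the threshold `tau` and the line-fitting example are assumptions for demonstration). The objective is min over (x, w) of sum_i [w_i r_i(x)^2 + (1 - w_i) tau] with binary inlier weights w, so the w-step has a closed form and the x-step is least squares on the currently selected inliers:

```python
import numpy as np

def truncated_loss_fit(A, b, tau, iters=50):
    """Minimize sum_i min(r_i(x)^2, tau) for r(x) = A @ x - b
    by alternating minimization (toy sketch, not the paper's code)."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]          # plain LS initialization
    w = np.ones(len(b), dtype=bool)
    for _ in range(iters):
        w = (A @ x - b) ** 2 <= tau                   # w-step: closed-form inlier indicator
        if not w.any():
            break
        x_new = np.linalg.lstsq(A[w], b[w], rcond=None)[0]  # x-step: LS on inliers
        if np.allclose(x_new, x):
            break
        x = x_new
    return x, w

# Fit the line y = 2*t + 1 from ten clean points plus three gross outliers.
t = np.array([0., 1, 2, 3, 4, 5, 6, 7, 8, 9, 2, 5, 7])
y = 2.0 * t + 1.0
y[10:] = [30.0, -20.0, 25.0]                          # corrupt the last three points
A = np.column_stack([t, np.ones_like(t)])
x_hat, inliers = truncated_loss_fit(A, y, tau=1.0)
# x_hat recovers [2, 1]; inliers flags exactly the ten clean points
```

Note that, unlike consensus maximization alone, the inlier indicator here depends on the current fitting residual, which reflects the abstract's point that SIME couples inlier identification with model estimation.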
