Title

A parameter-dependent smoother for the multigrid method

Authors

Lars Grasedyck, Maren Klever, Christian Löbbert, Tim A. Werthmann

Abstract

The solution of parameter-dependent linear systems by classical methods leads to an arithmetic effort that grows exponentially in the number of parameters. This renders the multigrid method, which has a well-understood convergence theory, infeasible. A parameter-dependent representation, e.g., a low-rank tensor format, can avoid this exponential dependence, but it is unknown how to calculate the inverse directly within such representations. Combining these representations with the multigrid method requires a parameter-dependent version of the classical multigrid theory as well as parameter-dependent representations of the linear system, the smoother, the prolongation and the restriction. A derived parameter-dependent version of the smoothing property, fulfilled by parameter-dependent versions of the Richardson and Jacobi methods, together with the approximation property proves the convergence of the multigrid method for arbitrary parameter-dependent representations. For a model problem, low-rank tensor formats represent the parameter-dependent linear system, the prolongation and the restriction. The smoother, a damped Jacobi method, is approximated directly in the low-rank tensor format by using exponential sums. Proving the smoothing property for this approximation guarantees the convergence of the parameter-dependent method. Numerical experiments for the parameter-dependent model problem, with a bounded parameter value range, indicate a grid-size-independent convergence rate.
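The core device the abstract describes is replacing the reciprocal of the (parameter-dependent) Jacobi diagonal by an exponential sum, 1/x ≈ Σ_j w_j exp(-a_j x), so the smoother stays representable in a separable low-rank format. The sketch below is a rough illustration only, not the paper's implementation: it assumes a 1D Poisson model problem A(p) = A0 + p·I with a bounded parameter range, and builds the exponential sum by trapezoidal quadrature of the identity 1/x = ∫ exp(s − x·eˢ) ds; all names and parameter choices are hypothetical.

```python
import numpy as np

# 1D Poisson stiffness matrix on a uniform grid with n interior points
# (hypothetical stand-in for the paper's model problem).
n, h = 63, 1.0 / 64
A0 = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2


def reciprocal_exp_sum(x, s_lo=-25.0, s_hi=4.0, k=400):
    """Approximate 1/x as an exponential sum  sum_j w_j * exp(-a_j * x).

    Uses the trapezoidal rule on  1/x = ∫ exp(s - x e^s) ds,  giving
    weights w_j = h_s * e^{s_j} and exponents a_j = e^{s_j}.
    """
    s = np.linspace(s_lo, s_hi, k)
    w = np.full(k, s[1] - s[0])
    w[[0, -1]] *= 0.5                       # trapezoid endpoint weights
    x = np.asarray(x, dtype=float)
    return sum(wj * np.exp(sj) * np.exp(-np.exp(sj) * x)
               for wj, sj in zip(w, s))


def damped_jacobi(p, b, x, sweeps=5, omega=2.0 / 3.0):
    """Damped Jacobi smoothing for A(p) = A0 + p*I, with the diagonal
    inverse replaced by the exponential-sum reciprocal."""
    A = A0 + p * np.eye(n)
    d_inv = reciprocal_exp_sum(np.diag(A))  # separable surrogate for 1/diag
    for _ in range(sweeps):
        x = x + omega * d_inv * (b - A @ x)
    return x
```

Because every term w_j·exp(-a_j·x) factorizes over the parameters entering x, the sum can be evaluated term by term inside a low-rank tensor format; in this dense toy setting that structure is invisible, but the quadrature itself and the resulting smoother are exactly the kind of object the abstract refers to.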
