Title

Neural network approximation of coarse-scale surrogates in numerical homogenization

Authors

Fabian Kröpfl, Roland Maier, Daniel Peterseim

Abstract

Coarse-scale surrogate models in the context of numerical homogenization of linear elliptic problems with arbitrarily rough diffusion coefficients rely on the efficient solution of fine-scale sub-problems on local subdomains, whose solutions are then employed to deduce appropriate coarse contributions to the surrogate model. However, in the absence of periodicity and scale separation, the reliability of such models requires the local subdomains to cover the whole domain, which may result in high offline costs, in particular for parameter-dependent and stochastic problems. This paper justifies the use of neural networks for the approximation of coarse-scale surrogate models by analyzing their approximation properties. For a prototypical and representative numerical homogenization technique, the Localized Orthogonal Decomposition method, we show that a single neural network is sufficient to approximate the coarse contributions of all occurring coefficient-dependent local sub-problems for a non-trivial class of diffusion coefficients up to arbitrary accuracy. We present rigorous upper bounds on the depth and number of non-zero parameters for such a network to achieve a given accuracy. Further, we analyze the overall error of the resulting neural network enhanced numerical homogenization surrogate model.
