Paper Title
An Intriguing Property of Geophysics Inversion
Paper Authors
Paper Abstract
Inversion techniques are widely used to reconstruct subsurface physical properties (e.g., velocity, conductivity) from surface-based geophysical measurements (e.g., seismic, electric/magnetic (EM) data). These problems are governed by partial differential equations (PDEs) such as the wave equation or Maxwell's equations. Solving geophysical inversion problems is challenging due to their ill-posedness and high computational cost. To alleviate these issues, recent studies leverage deep neural networks to directly learn the inversion mapping from measurements to properties. In this paper, we show that such a mapping can be modeled well by a very shallow (but not wide) network with only five layers. This is achieved based on our new finding of an intriguing property: a near-linear relationship between the input and output, after applying an integral transform in a high-dimensional space. In particular, when dealing with the inversion from seismic data to subsurface velocity governed by the wave equation, the integral of velocity with Gaussian kernels is linearly correlated with the integral of seismic data with sine kernels. Furthermore, this property can be easily turned into a lightweight encoder-decoder network for inversion. The encoder performs the integration of the seismic data and the linear transformation, without the need for fine-tuning. The decoder consists of only a single transformer block that reverses the integral of the velocity. Experiments show that this intriguing property holds for two geophysical inversion problems over four different datasets. Compared to the much deeper InversionNet, our method achieves comparable accuracy while consuming significantly fewer parameters.
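To make the described pipeline concrete, below is a minimal NumPy sketch of the claimed property. The data shapes, kernel counts, and kernel parameters are hypothetical assumptions not given in the abstract, and the least-squares fit is an illustrative stand-in, not the authors' implementation. The sketch shows the three steps named above: integrating seismic data against sine kernels, integrating velocity against Gaussian kernels, and fitting a single linear map between the two transformed representations.

```python
# Minimal sketch of the near-linear relationship between integral transforms of
# seismic data (sine kernels) and velocity (Gaussian kernels). All sizes and
# kernel parameters below are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: N paired samples of seismic data d(t, r) and velocity v(z, x).
N, T, R = 100, 200, 32     # samples, time steps, receivers (assumed)
Z, X = 70, 70              # depth and horizontal grid points (assumed)
d = rng.standard_normal((N, T, R))   # stand-in seismic gathers (replace with real data)
v = rng.standard_normal((N, Z, X))   # stand-in velocity models (replace with real data)

# Integral of seismic data with sine kernels, taken along the time axis.
t = np.linspace(0.0, 1.0, T)
freqs = np.arange(1, 33)                          # 32 sine kernels (assumed count)
K_sin = np.sin(2.0 * np.pi * np.outer(freqs, t))  # (32, T)
d_feat = np.einsum('kt,ntr->nkr', K_sin, d).reshape(N, -1)    # transformed seismic data

# Integral of velocity with Gaussian kernels, taken along the depth axis.
z = np.linspace(0.0, 1.0, Z)
centers = np.linspace(0.0, 1.0, 32)               # 32 Gaussian kernels (assumed count)
K_gauss = np.exp(-(centers[:, None] - z[None, :]) ** 2 / (2.0 * 0.05 ** 2))  # (32, Z)
v_feat = np.einsum('kz,nzx->nkx', K_gauss, v).reshape(N, -1)  # transformed velocity

# The claimed property: a single linear map W, fitted here by least squares,
# connects the transformed seismic data to the transformed velocity.
W, *_ = np.linalg.lstsq(d_feat, v_feat, rcond=None)
v_feat_pred = d_feat @ W
err = np.linalg.norm(v_feat_pred - v_feat) / np.linalg.norm(v_feat)
print(f"relative fit error: {err:.3f}")
```

With real paired seismic/velocity samples, a low relative fit error on held-out data would be the empirical signature of the near-linear relationship; the random arrays here only exercise the shapes and the transform pipeline.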