Paper Title
Real-time sparse-sampled Ptychographic imaging through deep neural networks
Paper Authors
Paper Abstract
Ptychography has rapidly grown in the fields of X-ray and electron imaging for its unprecedented ability to achieve nano- or atomic-scale resolution while simultaneously retrieving chemical or magnetic information from a sample. A ptychographic reconstruction is achieved by solving a complex inverse problem that imposes constraints on both the acquisition and the analysis of the data, which typically precludes real-time imaging due to the computational cost involved in solving this inverse problem. In this work we propose PtychoNN, a novel approach to solving the ptychography reconstruction problem based on deep convolutional neural networks. We demonstrate how the proposed method can be used to predict real-space structure and phase at each scan point solely from the corresponding far-field diffraction data. The presented results show that PtychoNN can be used effectively on experimental data and, once trained, can generate high-quality reconstructions of a sample up to hundreds of times faster than state-of-the-art ptychography reconstruction solutions. By surpassing the typical constraints of iterative model-based methods, we can significantly relax the data acquisition sampling conditions and still produce equally satisfactory reconstructions. Besides drastically accelerating acquisition and analysis, this capability enables new imaging scenarios that were not possible before, for instance with dose-sensitive, dynamic, and extremely voluminous samples.
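To illustrate the kind of mapping the abstract describes (a convolutional network that takes a single far-field diffraction pattern and predicts real-space amplitude and phase at that scan point), the following is a minimal PyTorch sketch. The layer sizes, names, and 64x64 input resolution are illustrative assumptions, not the published PtychoNN architecture.

```python
# Hypothetical encoder/two-decoder CNN sketch; all dimensions are assumptions.
import torch
import torch.nn as nn


class DiffractionToRealSpaceNet(nn.Module):
    """Maps one far-field diffraction pattern to real-space amplitude and phase."""

    def __init__(self):
        super().__init__()
        # Shared encoder: compress the diffraction pattern into a feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 64x64 -> 32x32
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
        )

        def make_decoder():
            # Upsample back to the input resolution and predict one channel.
            return nn.Sequential(
                nn.Upsample(scale_factor=2, mode="nearest"),
                nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
                nn.Upsample(scale_factor=2, mode="nearest"),
                nn.Conv2d(32, 1, 3, padding=1),
            )

        # Two separate decoder heads: one for amplitude, one for phase.
        self.amplitude_head = make_decoder()
        self.phase_head = make_decoder()

    def forward(self, x):
        features = self.encoder(x)
        return self.amplitude_head(features), self.phase_head(features)


if __name__ == "__main__":
    # One 64x64 diffraction pattern (batch of 1, single channel).
    diffraction = torch.rand(1, 1, 64, 64)
    amplitude, phase = DiffractionToRealSpaceNet()(diffraction)
    print(amplitude.shape, phase.shape)  # both torch.Size([1, 1, 64, 64])
```

Once such a network is trained on pairs of diffraction patterns and conventionally reconstructed amplitude/phase patches, inference is a single forward pass per scan point, which is what makes the real-time and sparse-sampling claims of the abstract plausible.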