Paper Title

On the infinite-dimensional QR algorithm

Paper Authors

Colbrook, Matthew J., Hansen, Anders C.

Paper Abstract

Spectral computations of infinite-dimensional operators are notoriously difficult, yet ubiquitous in the sciences. Indeed, despite more than half a century of research, it is still unknown which classes of operators allow for computation of spectra and eigenvectors with convergence rates and error control. Recent progress in classifying the difficulty of spectral problems into complexity hierarchies has revealed that the most difficult spectral problems are so hard that one needs three limits in the computation, and no convergence rates nor error control is possible. This begs the question: which classes of operators allow for computations with convergence rates and error control? In this paper we address this basic question, and the algorithm used is an infinite-dimensional version of the QR algorithm. Indeed, we generalise the QR algorithm to infinite-dimensional operators. We prove that not only is the algorithm executable on a finite machine, but one can also recover the extremal parts of the spectrum and corresponding eigenvectors, with convergence rates and error control. This allows for new classification results in the hierarchy of computational problems that existing algorithms have not been able to capture. The algorithm and convergence theorems are demonstrated on a wealth of examples with comparisons to standard approaches (that are notorious for providing false solutions). We also find that in some cases the IQR algorithm performs better than predicted by theory and make conjectures for future study.
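
The abstract only outlines the method at a high level. As a rough point of reference, the classical finite-dimensional QR iteration that the paper generalises can be sketched as follows; the tridiagonal test matrix, truncation size n, and iteration count here are hypothetical illustrations and this is not the authors' IQR algorithm, which works with the infinite-dimensional operator directly rather than a fixed finite truncation.

```python
import numpy as np

# Minimal sketch of the classical, finite-dimensional QR iteration that the
# paper generalises to infinite dimensions. The tridiagonal test matrix, the
# truncation size n and the iteration count are hypothetical illustrations.

n = 50
diag = np.arange(n, 0, -1, dtype=float)           # well-separated diagonal entries
off = 0.5 * np.ones(n - 1)
A0 = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)

A = A0.copy()
Q_total = np.eye(n)
for _ in range(1000):                  # unshifted QR iteration: A_{k+1} = R_k Q_k
    Q, R = np.linalg.qr(A)
    A = R @ Q                          # orthogonal similarity transform of A0
    Q_total = Q_total @ Q              # columns approximate eigenvectors of A0

# For a self-adjoint matrix the diagonal of A converges to the eigenvalues,
# roughly ordered by decreasing magnitude, so the extremal part of the spectrum
# (the part the paper recovers with error control in infinite dimensions)
# appears in the leading entries.
lam, v = np.diag(A)[0], Q_total[:, 0]
print("top approximate eigenvalue:", lam)
print("eigenpair residual:", np.linalg.norm(A0 @ v - lam * v))
```

In finite dimensions this similarity transform preserves the spectrum exactly; the paper's contribution is showing how such an iteration can be defined and executed for infinite-dimensional operators on a finite machine while retaining convergence rates and error control.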
