Paper Title

Estimating the entanglement of random multipartite quantum states

Paper Authors

Khurshed Fitter, Cecilia Lancien, Ion Nechita

Paper Abstract

Genuine multipartite entanglement of a given multipartite pure quantum state can be quantified through its geometric measure of entanglement, which, up to logarithms, is simply the maximum overlap of the corresponding unit tensor with product unit tensors, a quantity that is also known as the injective norm of the tensor. Our general goal in this work is to estimate this injective norm of randomly sampled tensors. To this end, we study and compare various algorithms, based either on the widely used alternating least squares method or on a novel normalized gradient descent approach, and suited to either symmetrized or non-symmetrized random tensors. We first benchmark their respective performances on the case of symmetrized real Gaussian tensors, whose asymptotic average injective norm is known analytically. Having established that our proposed normalized gradient descent algorithm generally performs best, we then use it to obtain numerical estimates for the average injective norm of complex Gaussian tensors (i.e. up to normalization uniformly distributed multipartite pure quantum states), with or without permutation-invariance. Finally, we also estimate the average injective norm of random matrix product states constructed from Gaussian local tensors, with or without translation-invariance. All these results constitute the first numerical estimates on the amount of genuinely multipartite entanglement typically present in various models of random multipartite pure states.
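For context, the injective norm of an order-k tensor T and the geometric measure of entanglement of a unit-norm multipartite pure state |ψ⟩ are conventionally defined as follows; the second equality is the "up to logarithms" relation between the two quantities referred to in the abstract:

\[
\|T\|_{\mathrm{inj}} = \max_{\|x_1\| = \cdots = \|x_k\| = 1} \bigl|\langle x_1 \otimes \cdots \otimes x_k ,\, T \rangle\bigr|,
\qquad
E_G(|\psi\rangle) = -\log_2 \max_{|\varphi\rangle\ \mathrm{product}} |\langle \varphi | \psi \rangle|^2 = -2 \log_2 \|\psi\|_{\mathrm{inj}}.
\]

The sketch below illustrates the widely used alternating least squares approach mentioned in the abstract (also known as higher-order power iteration) on an order-3 complex Gaussian tensor. It is a minimal NumPy reconstruction for illustration only, not the authors' code and not their proposed normalized gradient descent method; the function name injective_norm_als and the parameters iters, seed, and the number of restarts are hypothetical choices.

import numpy as np

def injective_norm_als(T, iters=500, seed=0):
    # One run of alternating maximization (higher-order power iteration) for an
    # order-3 tensor T: each factor is set to the normalized contraction of T
    # with the other two, which maximizes the overlap |<x ⊗ y ⊗ z, T>| over
    # that factor while the other two are held fixed.
    rng = np.random.default_rng(seed)
    d1, d2, d3 = T.shape
    x = rng.standard_normal(d1) + 1j * rng.standard_normal(d1)
    y = rng.standard_normal(d2) + 1j * rng.standard_normal(d2)
    z = rng.standard_normal(d3) + 1j * rng.standard_normal(d3)
    x, y, z = (v / np.linalg.norm(v) for v in (x, y, z))
    for _ in range(iters):
        x = np.einsum('ijk,j,k->i', T, y.conj(), z.conj())
        x /= np.linalg.norm(x)
        y = np.einsum('ijk,i,k->j', T, x.conj(), z.conj())
        y /= np.linalg.norm(y)
        z = np.einsum('ijk,i,j->k', T, x.conj(), y.conj())
        z /= np.linalg.norm(z)
    # Overlap of the resulting unit product vector with T.
    return abs(np.einsum('ijk,i,j,k->', T, x.conj(), y.conj(), z.conj()))

# Example: a complex Gaussian order-3 tensor normalized to unit 2-norm
# (a model of a random pure state of three d-level systems), with a few
# random restarts because the overlap maximization is non-convex.
rng = np.random.default_rng(1)
d = 8
T = rng.standard_normal((d, d, d)) + 1j * rng.standard_normal((d, d, d))
T /= np.linalg.norm(T)
estimate = max(injective_norm_als(T, seed=s) for s in range(20))
print(estimate)  # a lower bound on the injective norm of T

Each run only finds a local optimum, so the best value over several random restarts is reported; any such value is a lower bound on the true injective norm.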
