Paper Title
分销意见$ \ ell_1 $分析最小化
Distribution-aware $\ell_1$ Analysis Minimization
Paper Authors
Paper Abstract
This work is about recovering an analysis-sparse vector, i.e., a vector that is sparse in some transform domain, from under-sampled measurements. In real-world applications, there often exist random analysis-sparse vectors whose distribution in the analysis domain is known. To exploit this information, a weighted $\ell_1$ analysis minimization is often considered. However, choosing the weights in this setting is challenging and non-trivial. In this work, we provide an analytical method for choosing suitable weights. Specifically, we first obtain a tight upper-bound expression for the expected number of required measurements. This bound depends on two critical parameters: the support distribution and the expected sign in the analysis domain, both of which are accessible in advance. We then compute near-optimal weights by minimizing this expression with respect to the weights. Our strategy works in both noiseless and noisy settings. Numerical results demonstrate the superiority of the proposed method: weighted $\ell_1$ analysis minimization with our near-optimal weighting design needs considerably fewer measurements than its regular $\ell_1$ analysis counterpart.
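For reference, the weighted $\ell_1$ analysis minimization program referred to above can be written as follows. This is a standard formulation sketched under assumed notation (the abstract does not fix it): $\Omega \in \mathbb{R}^{p \times n}$ is the analysis operator, $A \in \mathbb{R}^{m \times n}$ the measurement matrix, $y = Ax_0 + \text{noise}$ the measurements, $w \in \mathbb{R}^p_{\ge 0}$ the weight vector, and $\varepsilon$ the noise level:
$$\hat{x} = \underset{x \in \mathbb{R}^n}{\arg\min}\ \sum_{i=1}^{p} w_i \,\big|(\Omega x)_i\big| \quad \text{subject to} \quad \|Ax - y\|_2 \le \varepsilon .$$
Setting $\varepsilon = 0$ gives the noiseless setting, and taking $w_i = 1$ for all $i$ recovers the regular $\ell_1$ analysis program; the paper's contribution is an analytical, distribution-aware choice of the weights $w_i$.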